Filebeat: shipping logs to Kafka while keeping only the message field
2022-01-26 · 走在冷风中吧
The Filebeat configuration is as follows:
filebeat.modules:
filebeat.prospectors:
- input_type: log
  paths:
    - /home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_es.log
  document_type: log_bus
- input_type: log
  paths:
    - /home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_track.log
  document_type: track_log
output.kafka:
  enabled: true
  #hosts: ["172.16.217:9200"]
  hosts: ["172.16.2.63:6667","172.16.2.64:6667","172.16.2.65:6667"]
  topic: '%{[type]}'
  version: 0.10.0
  max_message_bytes: 100000000
max_procs: 1
logging.to_files: true
logging.files:
Filebeat tails two different files, each routed to its own topic. With topic: '%{[type]}', the topic name resolves to the document_type configured on each input.
The records collected in Kafka look like this:
{
  "@timestamp": "2022-01-26T08:16:58.061Z",
  "beat": {
    "hostname": "t2-27-212",
    "name": "t2-27-212",
    "version": "5.6.1"
  },
  "input_type": "log",
  "message": "dddd",
  "offset": 83,
  "source": "/home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_track.log",
  "type": "track_log"
}
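On the consumer side, pulling the actual log line out of such a record is a one-step JSON lookup. A minimal Python sketch (the extract_message helper is hypothetical, not part of any Filebeat or Kafka API; the raw string is the example record above):

```python
import json

# A record as Filebeat 5.x ships it to Kafka (the example above, as one line).
raw = (
    '{"@timestamp": "2022-01-26T08:16:58.061Z",'
    ' "beat": {"hostname": "t2-27-212", "name": "t2-27-212", "version": "5.6.1"},'
    ' "input_type": "log", "message": "dddd", "offset": 83,'
    ' "source": "/home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_track.log",'
    ' "type": "track_log"}'
)

def extract_message(record: str) -> str:
    """Parse a Filebeat JSON record and return only the log line itself."""
    return json.loads(record)["message"]

print(extract_message(raw))  # dddd
```

Every consumer having to know and discard the metadata envelope like this is exactly the overhead the processors configuration below avoids.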
Only the message field holds the log line that was actually written; the remaining fields are metadata attached by Filebeat. When processing Filebeat messages we usually care only about message, so to keep the records lean we can configure processors in Filebeat. The full configuration file is as follows:
filebeat.modules:
filebeat.prospectors:
- input_type: log
  paths:
    - /home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_es.log
  document_type: log_bus
- input_type: log
  paths:
    - /home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_track.log
  document_type: track_log
processors:
- drop_fields:
    fields: ["log_type", "input_type", "offset", "beat", "source", "type"]
output.kafka:
  enabled: true
  #hosts: ["172.16.217:9200"]
  hosts: ["172.16.2.63:6667","172.16.2.64:6667","172.16.2.65:6667"]
  topic: '%{[type]}'
  version: 0.10.0
  max_message_bytes: 100000000
max_procs: 1
logging.to_files: true
logging.files:
With this in place, the output looks like the following. Note that drop_fields can never remove the @timestamp and type fields, even when they are listed, which is why type still appears:
{
  "@timestamp": "2022-01-26T08:52:02.717Z",
  "message": "test lili id4 ",
  "type": "log_bus"
}
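The processor's behavior can be modeled with a short Python sketch (this drop_fields function is a stand-in for illustration, not Filebeat code; it encodes the documented rule that @timestamp and type are protected from dropping):

```python
def drop_fields(event: dict, fields: list) -> dict:
    """Rough model of Filebeat's drop_fields processor: remove the listed
    fields, except the protected @timestamp and type fields."""
    protected = {"@timestamp", "type"}
    return {k: v for k, v in event.items() if k in protected or k not in fields}

# An event shaped like the first Kafka record shown earlier.
event = {
    "@timestamp": "2022-01-26T08:52:02.717Z",
    "beat": {"hostname": "t2-27-212", "name": "t2-27-212", "version": "5.6.1"},
    "input_type": "log",
    "message": "test lili id4 ",
    "offset": 83,
    "source": "/home/byxf-dcp-v2/jetty/logs/dcp-core-v2/dcp-core-v2_es.log",
    "type": "log_bus",
}

slim = drop_fields(event, ["log_type", "input_type", "offset", "beat", "source", "type"])
print(slim)  # keeps only @timestamp, message and type
```

Even though "type" is listed in fields, it survives because it is protected, matching the output above.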
Note: processors apply to every input, so when adding this configuration, check whether it affects the reading of existing data sources.