My setup is as follows: Filebeat and Logstash forward the logs of the application instances to Elasticsearch.
Apps
+--------+
 | +--------+
 | | +--------+      +----------+      +----------+
 +-| |  file  | ---> | logstash | ---> | elastic  |
   +-|  beat  |      |   (1)    |      | search   |
     +--------+      +----------+      +----------+
                          |                 |
             (not avail)  X                 | (query)
                          V                 |
                     +----------+           V
                     |          |        +------+
                     | logstash | <----- | Json |
                     |   (2)    |        | file |
                     +----------+        +------+

I want to test the log processing in logstash-2, but forwarding from logstash-1 is currently not possible. So I tried the following: I queried Elasticsearch and retrieved the _source field of the documents, which gave me JSON documents like these:
{
  "@timestamp": <timestamp>,
  "@version": "1",
  "requestMethod": "PUT",
  "requestUri": "/api/endopoint",
  "servername": "myserver"
  .... many other fields
}
{
  "@timestamp": <timestamp>,
  "@version": "1",
}
... many other json objects

My question is: how can I process these JSON documents coming from an Elasticsearch query with Logstash?
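For context, a file of this shape can be produced from the raw search response with a few lines of Python (a sketch, not part of the original post; it assumes the standard `hits.hits` response layout, and the field values are illustrative):

```python
import json

# Minimal fake search response; a real Elasticsearch search response
# has the same hits.hits shape around the _source documents.
response = {
    "hits": {
        "hits": [
            {"_source": {"@version": "1", "requestMethod": "PUT"}},
            {"_source": {"@version": "1"}},
        ]
    }
}

with open("events.json", "w") as f:
    for hit in response["hits"]["hits"]:
        # indent=2 pretty-prints each document across several lines,
        # leaving the closing "}" alone at the start of a line
        f.write(json.dumps(hit["_source"], indent=2) + "\n")
```

Because each document's closing brace sits alone at the start of a line, a pattern like `^\}` can mark where one event ends and the next begins.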
I tried using a multiline codec followed by a json filter to process them, but I couldn't get it to work. Below is one attempt:
input {
  file {
    path => "events.json"
    sincedb_path => "/dev/null"
    start_position => beginning
    codec => multiline {
      pattern => "^\}" # end of each json object
      what => "previous"
    }
  }
}
filter {
  json {
    source => "event"
  }
}
output {
  stdout {}
}

Posted on 2018-09-26 22:26:49
After some additional research I realized the multiline codec configuration was wrong. I have fixed it, and now I get the whole event in the message field.
input {
  file {
    path => "events.json"
    sincedb_path => "/dev/null"
    start_position => beginning
    codec => multiline {
      pattern => "^\}" # end of each json object
      negate => true
      what => "next"
    }
  }
}
filter {
  json {
    source => "message"
  }
  mutate {
    remove_field => ["message"]
  }
}
output {
  stdout {}
}

https://stackoverflow.com/questions/52516118
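The corrected codec works because `negate => true` with `what => "next"` buffers every line that does not match `^\}` and attaches it to the next line that does, so each event ends exactly at a closing brace. A small Python simulation of that grouping (an illustration only, not Logstash itself):

```python
import json
import re

def group_multiline(lines, pattern=r"^\}"):
    """Simulate the multiline codec with negate => true, what => "next":
    lines NOT matching the pattern are buffered and joined to the next
    matching line, which closes the event."""
    events, buffer = [], []
    for line in lines:
        buffer.append(line)
        if re.match(pattern, line):  # a closing "}" ends the event
            events.append("\n".join(buffer))
            buffer = []
    return events

# Two pretty-printed documents, as they would appear in events.json
raw = [
    '{',
    '  "@version": "1",',
    '  "requestMethod": "PUT"',
    '}',
    '{',
    '  "@version": "1"',
    '}',
]
events = group_multiline(raw)
# Each grouped event is a complete JSON document, which is exactly
# what the json filter then parses out of the message field.
print(len(events))  # → 2
```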