We use the Serilog HTTP sink to send messages to Logstash. However, the HTTP message body looks like this:
{
  "events": [
    {
      "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    },
    {
      "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    }
  ]
}
That is, the log events are batched in an array. The messages could be sent one at a time, but the body would still be an array of items.
The event then shows up in Kibana as a single message field whose value is:
{
  "events": [
    {
      // ...
    },
    {
      // ...
    }
  ]
}
That is, exactly what the HTTP input received.
How can I split the items of the events array into individual log events and pull the properties up to the top level, so that I end up with two log events in Elasticsearch:
"Timestamp": "2016-11-03T00:09:11.4899425+01:00",
"Level": "Debug",
"MessageTemplate": "Logging {@Heartbeat} from {Computer}",
"RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
"Properties": {
  "Heartbeat": {
    "UserName": "Mike",
    "UserDomainName": "Home"
  },
  "Computer": "Workstation"
}

and

"Timestamp": "2016-11-03T00:09:12.4905685+01:00",
"Level": "Debug",
"MessageTemplate": "Logging {@Heartbeat} from {Computer}",
"RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
"Properties": {
  "Heartbeat": {
    "UserName": "Mike",
    "UserDomainName": "Home"
  },
  "Computer": "Workstation"
}

Posted on 2017-10-23 13:37:00
After upgrading to Logstash 5.0, Val's solution (the ruby filter answer below) stopped working because of a change in the Event API: updates to event.to_hash are no longer reflected in the original event. For Logstash 5.0+ you have to use the event.get('field') and event.set('field', value) accessors instead.
The updated solution now looks like this:
input {
  http {
    port => 8080
    codec => json
  }
}
filter {
  split {
    field => "events"
  }
  ruby {
    code => "
      event.get('events').each do |k, v|
        event.set(k, v)
      end
    "
  }
  mutate {
    remove_field => [ "events" ]
  }
}

Posted on 2017-01-20 04:53:12
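The get/set pattern in that updated filter can be exercised outside Logstash with a minimal stand-in for the 5.0+ event API. The Event class below is a simplified mock for illustration only, not the real LogStash::Event:

```ruby
# Simplified mock of the Logstash 5.0+ Event API (illustration only).
class Event
  def initialize(data)
    @data = data
  end

  # 5.0+ accessors: fields must be read and written through get/set.
  def get(field)
    @data[field]
  end

  def set(field, value)
    @data[field] = value
  end

  # In 5.0+ this returns a copy, so mutating it no longer changes the event.
  def to_hash
    @data.dup
  end
end

# After the split filter, each event's "events" field holds a single hash.
event = Event.new(
  'events' => { 'Level' => 'Debug', 'Computer' => 'Workstation' }
)

# Same logic as the ruby filter body: copy every sub-field to the top level.
event.get('events').each do |k, v|
  event.set(k, v)
end

event.get('Level')    # => "Debug"
event.get('Computer') # => "Workstation"
```

The mutate filter then removes the now-redundant events field.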
You can achieve what you want with an additional ruby filter that pulls the fields out of the sub-structure:
filter {
  split {
    field => "events"
  }
  ruby {
    code => "
      event.to_hash.update(event['events'].to_hash)
      event.to_hash.delete_if {|k, v| k == 'events'}
    "
  }
}

The resulting event looks like this:
{
  "@version" => "1",
  "@timestamp" => "2017-01-20T04:51:39.223Z",
  "host" => "iMac.local",
  "Timestamp" => "2016-11-03T00:09:12.4905685+01:00",
  "Level" => "Debug",
  "MessageTemplate" => "Logging {@Heartbeat} from {Computer}",
  "RenderedMessage" => "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
  "Properties" => {
    "Heartbeat" => {
      "UserName" => "Mike",
      "UserDomainName" => "Home"
    },
    "Computer" => "Workstation"
  }
}

Posted on 2020-01-24 07:30:47
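Val's original filter body above relies only on plain Ruby Hash semantics: update merges another hash in place, and delete_if drops matching keys. Before Logstash 5.0, event.to_hash exposed the live hash backing the event, so these in-place mutations took effect. The mechanics can be checked in isolation:

```ruby
# Stand-in for what event.to_hash returned before Logstash 5.0:
# the live hash backing the event, so in-place mutations took effect.
event_hash = {
  '@timestamp' => '2017-01-20T04:51:39.223Z',
  'events'     => { 'Level' => 'Debug', 'Computer' => 'Workstation' }
}

# Same steps as the filter body: merge the sub-structure into the top
# level, then drop the wrapper field.
event_hash.update(event_hash['events'])
event_hash.delete_if { |k, v| k == 'events' }

event_hash.keys # => ["@timestamp", "Level", "Computer"]
```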
This can now be achieved by setting the batchFormatter. The default batch formatter will create incorrect events, but the ArrayBatchFormatter fixes this:
logger.WriteTo.DurableHttpUsingFileSizeRolledBuffers(
    requestUri: new Uri($"http://{elasticHost}:{elasticPort}").ToString(),
    batchFormatter: new ArrayBatchFormatter());

Source: https://stackoverflow.com/questions/41746502