My Logstash filter configuration looks like this:
filter {
  grok {
    patterns_dir => ["/usr/share/logstash/pipeline/patterns/"]
    match => {
      "[message]" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD:thread} %{LOGLEVEL:level} %{LOGGER:logger} %{CONTEXT:context} - %{GREEDYDATA:message}"
    }
  }
  mutate {
    rename => { "[fields][index]" => "application" }
    rename => { "[host][name]" => "instance" }
    remove_field => ["@version","agent.ephemeral_id","agent","ecs","fields","input","tags"]
  }
}

The Grok debugger says everything is fine. For the log line:
2020-10-28 05:14:41,282 [Worker-5] DEBUG Amount - calculate operation: [1], useCurrencyCodeOfPosition: [false]

I get the following output:
{
  "level": "DEBUG",
  "logger": "Amount",
  "context": "",
  "thread": "Worker-5",
  "message": "calculate operation: [1], useCurrencyCodeOfPosition: [false]",
  "timestamp": "2020-10-28 05:14:41,282"
}

The patterns are defined as follows:
THREAD \[(?<thread>[^\]]*)\]
LOGGER (?<logger>[^ ]*)
CONTEXT (?<context>[^-]*)

Now every value produced by the grok filter is duplicated, as in this example:
"logger" => [
    [0] "Amount",
    [1] "Amount"
],
"thread" => [
    [0] "[Worker-5]",
    [1] "Worker-5"
]

What is wrong here? I just can't figure it out. This is my first filter :). I'm working with Logstash 7.9.2.
Posted on 2020-10-28 18:28:34
I think the problem is with the custom patterns in your filter: each pattern already contains a named capture group (e.g. `(?<thread>...)`), and the match additionally assigns a semantic name (`%{THREAD:thread}`). Grok therefore captures every field twice, once via the inner group and once via the semantic name, which is why the values end up as arrays (note that `thread` gets both `[Worker-5]`, the full semantic capture, and `Worker-5`, the inner group). What you want can also be achieved simply with the out-of-the-box patterns below.
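Alternatively, if you prefer to keep your custom patterns, referencing them without a semantic name should avoid the double capture, because the inner named groups (`(?<thread>...)`, `(?<logger>...)`, `(?<context>...)`) then provide the only captures. A sketch of that idea, untested:

```
filter {
  grok {
    patterns_dir => ["/usr/share/logstash/pipeline/patterns/"]
    match => {
      # %{THREAD} instead of %{THREAD:thread}: the pattern's own
      # named group still populates the "thread" field, just once
      "message" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD} %{LOGLEVEL:level} %{LOGGER} %{CONTEXT} - %{GREEDYDATA:message}"
    }
  }
}
```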
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}\[%{DATA:thread}\]%{SPACE}%{LOGLEVEL:level}%{SPACE}%{NOTSPACE:logger} %{DATA:context}-%{SPACE}%{GREEDYDATA:message}"
    }
    overwrite => [ "message" ]
  }
  mutate {
    rename => { "[fields][index]" => "application" }
    rename => { "[host][name]" => "instance" }
    remove_field => ["@version","agent.ephemeral_id","agent","ecs","fields","input","tags"]
  }
}

Check out this link for the default grok patterns. If you need to do time-series analysis on these events, I suggest overwriting @timestamp with the parsed timestamp, or at least applying a date filter to timestamp.
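For example, a date filter along these lines would set @timestamp from the parsed field (the format string matches the sample log line above; adjust it if your timestamps differ):

```
filter {
  date {
    # Parse "2020-10-28 05:14:41,282" and write it to @timestamp
    match  => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
```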
If you want to capture multi-line stack-trace errors, consider using the multiline codec on the input plugin.
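A sketch of that idea, assuming a file input and that every new event starts with an ISO8601 timestamp (the path and pattern here are illustrative):

```
input {
  file {
    path => "/var/log/app/application.log"   # hypothetical path
    codec => multiline {
      # Any line that does NOT start with a timestamp is a continuation
      # of the previous event (e.g. a stack-trace line)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate  => true
      what    => "previous"
    }
  }
}
```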
https://stackoverflow.com/questions/64573036