I have a working ELK stack on Debian Wheezy and have set up Nxlog to collect Windows event logs. I can see the logs in Kibana, so everything works, but I'm getting too much data and want to trim it by removing some fields I don't need.
I added a filter section, but it doesn't work at all. What could be the reason? Here is my config:
input {
  tcp {
    type   => "eventlog"
    port   => 3515
    format => "json"
  }
}
filter {
  type => "eventlog"
  mutate {
    remove => { "Hostname", "Keywords", "SeverityValue", "Severity", "SourceName", "ProviderGuid" }
    remove => { "Version", "Task", "OpcodeValue", "RecordNumber", "ProcessID", "ThreadID", "Channel" }
    remove => { "Category", "Opcode", "SubjectUserSid", "SubjectUserName", "SubjectDomainName" }
    remove => { "SubjectLogonId", "ObjectType", "IpPort", "AccessMask", "AccessList", "AccessReason" }
    remove => { "EventReceivedTime", "SourceModuleName", "SourceModuleType", "@version", "type" }
    remove => { "_index", "_type", "_id", "_score", "_source", "KeyLength", "TargetUserSid" }
    remove => { "TargetDomainName", "TargetLogonId", "LogonType", "LogonProcessName", "AuthenticationPackageName" }
    remove => { "LogonGuid", "TransmittedServices", "LmPackageName", "ProcessName", "ImpersonationLevel" }
  }
}
output {
  elasticsearch {
    cluster   => "wisp"
    node_name => "io"
  }
}

Posted on 2015-04-10 17:37:15
I think you may be trying to remove fields that don't exist in some of your logs. Do all of your logs contain every field you want to remove? If not, you have to identify the relevant logs before removing fields. Your filter configuration would look something like this:
filter {
  if [type] == "eventlog" and [somefield] == "somevalue" {
    mutate {
      remove_field => [ "specificfieldtoremove1", "specificfieldtoremove2" ]
    }
  }
}

(Note that a bare `type => "eventlog"` line directly inside `filter {}` is not valid; use a conditional on `[type]` instead, and the `mutate` option for dropping fields is `remove_field` with an array.)

https://stackoverflow.com/questions/29556523
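For reference, here is a sketch of how the original config could be rewritten with valid syntax. This assumes Logstash 1.4 or later, where the deprecated `format => "json"` input option is replaced by `codec => json`, a bare `type =>` line inside `filter {}` becomes an `if [type] == ...` conditional, and `mutate` drops fields via `remove_field` with an array. Also note that `_index`, `_type`, `_id`, `_score` and `_source` are Elasticsearch document metadata, not Logstash event fields, so listing them here has no effect; the field list below is abridged accordingly.

```
input {
  tcp {
    type  => "eventlog"
    port  => 3515
    codec => json    # replaces the deprecated format => "json"
  }
}

filter {
  # Conditional replaces the invalid bare `type => "eventlog"` line
  if [type] == "eventlog" {
    mutate {
      # remove_field takes a single array of field names;
      # fields absent from a given event are silently skipped
      remove_field => [
        "Hostname", "Keywords", "SeverityValue", "Severity", "SourceName",
        "ProviderGuid", "Version", "Task", "OpcodeValue", "RecordNumber",
        "ProcessID", "ThreadID", "Channel", "Category", "Opcode",
        "SubjectUserSid", "SubjectUserName", "SubjectDomainName",
        "SubjectLogonId", "ObjectType", "IpPort", "AccessMask",
        "AccessList", "AccessReason", "EventReceivedTime",
        "SourceModuleName", "SourceModuleType", "KeyLength",
        "TargetUserSid", "TargetDomainName", "TargetLogonId", "LogonType",
        "LogonProcessName", "AuthenticationPackageName", "LogonGuid",
        "TransmittedServices", "LmPackageName", "ProcessName",
        "ImpersonationLevel"
      ]
    }
  }
}

output {
  elasticsearch {
    cluster   => "wisp"
    node_name => "io"
  }
}
```

Unlike the original, `remove_field` does not fail when a listed field is missing from an event, so this works even when not every log line carries every field.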