My logstash configuration is giving me this error whenever I run the following command: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf -auto-debug
reason=>"Expected one of #, {, ,, ] at line 27, column 95 (byte 677) after filter {\n\n\tif [type] == \"s3\" {\n\t\tgrok {\n\t\n \t\t\tmatch => [\"message\", \"%{IP:client} %{USERNAME} %{USERNAME} \\[%{HTTPDATE:timestamp}\\] (?:\"", :level=>:error, :file=>"logstash/agent.rb", :line=>"430", :method=>"create_pipeline"}

This is related to the pattern, which I checked in the Grok Debugger and which gives me the expected result there. Please help.
Here is my logstash configuration:
input {
  s3 {
    access_key_id => ""
    bucket => ""
    region => ""
    secret_access_key => ""
    prefix => "access"
    type => "s3"
    add_field => { source => gzfiles }
    sincedb_path => "/dev/null"
    #path => "/home/shubham/logstash.json"
    #temporary_directory => "/home/shubham/S3_temp/"
    backup_add_prefix => "logstash-backup"
    backup_to_bucket => "logstash-nginx-overcart"
  }
}

filter {
  if [type] == "s3" {
    grok {
      match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}" %{NUMBER:reponse} %{NUMBER:bytes} "%{USERNAME}" %{GREEDYDATA:responseMessage})"]
    }
  }
}

output {
  elasticsearch {
    hosts => ''
    index => "accesslogs"
}

Posted on 2017-03-13 20:25:39
There are several unescaped " characters in your match assignment (e.g., around the USERNAME variables) that trip up the parser. If you escape those characters with a backslash, it should work.
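With the inner double quotes escaped, the filter block from the question would look roughly like this (a sketch only; the pattern and the field names, including the `reponse` spelling, are copied verbatim from the question):

```
filter {
  if [type] == "s3" {
    grok {
      # Inner quotes are now written as \" so the config parser
      # no longer sees them as the end of the pattern string.
      match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:\"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}\" %{NUMBER:reponse} %{NUMBER:bytes} \"%{USERNAME}\" %{GREEDYDATA:responseMessage})"]
    }
  }
}
```

Alternatively, you can often sidestep the escaping entirely by using a predefined pattern such as %{COMBINEDAPACHELOG} if your access-log format matches it.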
https://stackoverflow.com/questions/42772051