Logstash Grok error
My Logstash configuration gives me this error whenever I run the command: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --auto-reload --debug
reason=>"Expected one of #, {, ,, ] at line 27, column 95 (byte 677) after filter {\n\n\tif [type] == \"s3\" {\n\t\tgrok {\n\t\n \t\t\tmatch => [\"message\", \"%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:\"", :level=>:error, :file=>"logstash/agent.rb", :line=>"430", :method=>"create_pipeline"}
This has something to do with my pattern, but when I check the same pattern in the online Grok debugger it gives me the desired result. Please help.
Here is my Logstash configuration:
input {
    s3 {
        access_key_id => ""
        bucket => ""
        region => ""
        secret_access_key => ""
        prefix => "access"
        type => "s3"
        add_field => { source => gzfiles }
        sincedb_path => "/dev/null"
        #path => "/home/shubham/logstash.json"
        #temporary_directory => "/home/shubham/S3_temp/"
        backup_add_prefix => "logstash-backup"
        backup_to_bucket => "logstash-nginx-overcart"
    }
}
filter {
    if [type] == "s3" {
        grok {
            match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}" %{NUMBER:reponse} %{NUMBER:bytes} "%{USERNAME}" %{GREEDYDATA:responseMessage})"]
        }
    }
}
output {
    elasticsearch {
        hosts => ''
        index => "accesslogs"
    }
}
There are several unescaped " characters inside your match assignment (for example, the ones around the USERNAME variable), which trip up the config parser. Escaping them with \ should fix it.
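For illustration, the grok filter could look like this with the inner quotes escaped and the pattern kept on a single line (field names such as reponse are copied unchanged from the original question):

```
filter {
    if [type] == "s3" {
        grok {
            # Inner " characters are escaped as \" so the config parser
            # does not treat them as the end of the pattern string.
            match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:\"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}\" %{NUMBER:reponse} %{NUMBER:bytes} \"%{USERNAME}\" %{GREEDYDATA:responseMessage})"]
        }
    }
}
```

This is only a sketch of the escaping fix; verify the resulting pattern against your actual log lines in the Grok debugger.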