How do you create a grok-based filter in Logstash?
I am trying to insert these entries into Elasticsearch using Logstash:
2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, HTTPReport - Saved report job 1000 for report
2016-05-18 00:14:30,937 DEBUG http-bio-/158.134.18.57-8200-exec-1, JavaReport -
************************************************************************************************
Report Job information
Job ID : 12000
Job name : 101
Job priority : 1
Job group : BACKGROUND
Report : Month End
2016-05-18 00:17:38,868 DEBUG JobsMaintenanceScheduler_Worker-1, DailyReport - System information: available processors = 12; memory status : 2638 MB of 4096 MB
I have this filter in my Logstash conf file:
input {
  file {
    path => "/data/*.log"
    type => "app_log"
    start_position => "beginning"
  }
}

filter {
  multiline {
    pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
    negate => true
    what => "previous"
  }
  if [type] == "app_log" {
    grok {
      patterns_dir => ["/pattern"]
      match => {"message" => "%{TIMESTAMP_ISO8601:timestamp},%{NUMBER:Num_field} %{WORD:error_level} %{GREEDYDATA:origin}, %{WORD:logger} - %{GREEDYDATA:event%}"}
    }
  }
  mutate { add_field => {"type" => "app_log"}}
  mutate { add_field => {"machine_name" => "server101"}}
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "app_log-%{+YYYY.MM.dd}"
    manage_template => false
  }
}
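The merge behavior I expect from the multiline filter can be sketched roughly like this (a simplified Python approximation, not the actual Logstash implementation; an anchored regex stands in for the config's pattern): any line that does not start a new timestamped event (`negate => true`) is appended to the previous event (`what => "previous"`).

```python
import re

# Simplified stand-in for the multiline pattern in the config above:
# an event starts with a "20xx-" timestamp at the beginning of the line.
EVENT_START = re.compile(r"^\s*20[0-9]{2}-")

def merge_multiline(lines):
    """Fold continuation lines into the previous timestamped event."""
    events = []
    for line in lines:
        if EVENT_START.match(line) or not events:
            events.append(line)            # start a new event
        else:
            events[-1] += "\n" + line      # continuation: merge into previous
    return events
```

Run against the sample log above, this yields three events: the banner and the "Report Job information" block fold into the second timestamped line.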
I am getting this error:
translation missing: en.logstash.runner.configuration.file-not-found {:level=>:error}
It fails to insert. Any idea what might be wrong?
Upgrade to the latest version of Logstash (= 2.3.2), fix your grok filter as shown below, and it will work:
grok {
  add_field => {"machine_name" => "server010"}
  match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}"}
}
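As a quick sanity check, the corrected pattern can be approximated with a plain regex (the named groups below are simplified stand-ins for grok's `TIMESTAMP_ISO8601`, `WORD`, `DATA`, and `GREEDYDATA` patterns, so this is only a rough local test, not grok itself):

```python
import re

# Hypothetical regex approximation of the fixed grok match, for local testing:
#   TIMESTAMP_ISO8601 -> date + time with optional ",millis"
#   WORD              -> \w+
#   DATA              -> non-greedy .*?
#   GREEDYDATA        -> greedy .*
PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}(?:[.,]\d+)?) "
    r"(?P<error_level>\w+) "
    r"(?P<origin>.*?), "
    r"(?P<logger_name>.*?) - "
    r"(?P<EVENT>.*)"
)

line = ("2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, "
        "HTTPReport - Saved report job 1000 for report")
m = PATTERN.match(line)
print(m.groupdict())
```

Each capture lands where the answer's grok fields would: the thread name ends at the first `", "` and the logger ends at `" - "`.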