Grok filter for my log pattern
I am experimenting with the ELK stack. I am trying to feed logs with the following pattern into Logstash:
14:25:43.324 [http-nio-9090-exec-116] INFO com.app.MainApp - Request has been detected
I tried the following grok patterns as filters in logstash.conf:
match => { "message" => [ " (?<timestamp>%{HOUR}:%{MINUTE}:%{SECOND}) \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:Class}\- %{GREEDYDATA:message}" ]}
match => { "message" => [ " %{TIME:timestamp} \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:Class}\- %{GREEDYDATA:message}" ]}
But when I feed logs into Logstash, I get the following error:
[0] "_grokparsefailure"
Can anyone suggest the correct grok filter for the above log pattern?
The parse failure was fixed by removing the leading space from the pattern. The working logstash.conf after removing that space looks like this:
input {
  file {
    path => ["./debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}
output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
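To see why the leading space caused `_grokparsefailure`, it helps to remember that grok patterns are anchored regular expressions underneath. Below is a rough Python sketch (field and variable names are my own, and the hand-expanded regex only approximates the grok pattern for this one sample line) showing that the pattern matches the sample log line without the leading space, and fails as soon as a space is prepended:

```python
import re

# Hand-expanded approximation of the grok pattern:
# %{TIME} [%{NOTSPACE:thread}] %{LOGLEVEL} %{DATA:class} - %{GREEDYDATA:message}
LOG_RE = re.compile(
    r"^(?P<timestamp>\d{2}:\d{2}:\d{2}\.\d+)"  # %{TIME:timestamp}
    r" \[(?P<thread>\S+)\]"                    # \[%{NOTSPACE:thread}\]
    r" (?P<loglevel>[A-Z]+)"                   # %{LOGLEVEL:loglevel}
    r"\s+(?P<cls>\S+)"                         # %{DATA:Class}
    r" - (?P<message>.*)$"                     # - %{GREEDYDATA:message}
)

line = ("14:25:43.324 [http-nio-9090-exec-116] INFO "
        "com.app.MainApp - Request has been detected")

m = LOG_RE.match(line)
print(m.groupdict() if m else "no match")

# Same pattern but with the leading space, as in the original attempt:
# the line does not start with a space, so the match fails.
BAD_RE = re.compile(r"^ " + LOG_RE.pattern[1:])
print(BAD_RE.match(line))  # None
```

The failing case is exactly what grok reports as `_grokparsefailure`: the pattern expects a space before the timestamp that is not present in the log line.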