Logstash - not able to get the @timestamp to work
Logstash newbie here. I am trying to filter log lines like the following through Logstash:
2015-03-31 02:53:39 INFO This is info message 5
The config file I'm using looks like this:
input {
file {
path => "/sample/log4j_log.log"
start_position => beginning
}
}
filter {
grok {
match => [ "message" , "%{DATESTAMP:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
}
date {
locale => "en"
match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
}
}
output {
#elasticsearch { host => localhost }
stdout { codec => rubydebug }
}
The output I get is:
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "0015-03-30T21:00:11.000Z",
"host" => "abc",
"path" => "/sample/log4j_log.log",
"logtimestamp" => "15-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
I see that the logtimestamp field comes out in the format "YY-MM-dd HH:mm:ss", and I'm not sure why it gets converted to that format; I even tried it in the date filter.
In those cases, I get this output:
{
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "2015-04-07T17:55:51.231Z",
"host" => "abc",
"path" => "/sample/log4j_log.log",
"logtimestamp" => "15-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
}
In all of these cases @timestamp does not match the actual log event timestamp, which causes problems with the Elasticsearch + Kibana visualizations.
I tried including target => "@timestamp" and locale => "en" as suggested in other questions on Stack Overflow, but with no success.
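To be concrete, the date filter in that attempt looked roughly like this (same match pattern as in the config above, with the suggested options added):
date {
    locale => "en"
    target => "@timestamp"
    match  => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
}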
The only thing I don't seem to have tried yet is this:
Logstash date parsing as timestamp using the date filter
I don't think it quite applies to my log event, though.
Your grok pattern is not correct. DATESTAMP is built on the DATE_US/DATE_EU patterns, which expect the day or month first rather than a year-first ISO date, so it only matched "15-03-31" and the date filter then read "15" as the year, which is where the 0015 timestamp comes from.
Change it to use TIMESTAMP_ISO8601 instead of DATESTAMP:
grok {
match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
}
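With that change, the whole filter section would look roughly like this (your original date pattern can stay as it is, since TIMESTAMP_ISO8601 now captures the four-digit year):
filter {
    grok {
        match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
    }
    date {
        locale => "en"
        match  => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
    }
}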
Here is the output:
{
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "2015-03-30T18:53:39.000Z",
"host" => "BEN_LIM",
"logtimestamp" => "2015-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
}
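Note that @timestamp here is 2015-03-30T18:53:39.000Z, i.e. 2015-03-31 02:53:39 converted to UTC from my machine's local timezone (UTC+8); by default the date filter interprets the parsed time in the Logstash host's timezone. If your log4j timestamps are written in a different timezone, you can set it explicitly. A minimal sketch, where "UTC" is only a placeholder for whatever timezone your logs actually use:
date {
    match    => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
    timezone => "UTC"   # placeholder: the timezone the log timestamps were written in
}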