Logstash not applying filter to Apache logs
I'm trying to parse some Apache access logs with the ELK stack, but I'm running into a problem: Logstash isn't applying the Apache filter I created to any of the Apache logs.
Here is my filter file:
filter {
  if [type] == "apache_access" {
    grok {
      patterns_dir => ["/opt/logstash/patterns/apache"]
      add_tag => ["grokked", "apache"]
      match => ["messege", "%{IP:client} - - \[%{HTTPDATE:event_date}\] %{QS:first} %{NUMBER:response} %{NUMBER:bytes} %{QS:destination} %{QS:browser}"]
    }
  }
}
Filebeat config:
filebeat:
  prospectors:
    -
      paths:
        - /var/log/apache2/access.log
      document_type: apache_access
  registry_file: /var/lib/filebeat/registry
I also used the sample log file from logz.io, which contains entries like these:
88.114.162.149 - - [04/Aug/2016:00:00:05 +0000] "GET /item/giftcards/3802 HTTP/1.1" 200 82 "/category/books" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:9.0.1) Gecko/20100101 Firefox/9.0.1"
156.141.192.36 - - [04/Aug/2016:00:00:10 +0000] "GET /category/toys?from=20 HTTP/1.1" 200 135 "/category/toys" "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11"
92.213.110.215 - - [04/Aug/2016:00:00:15 +0000] "GET /category/software HTTP/1.1" 200 108 "/category/books" "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11"
80.225.119.24 - - [04/Aug/2016:00:00:20 +0000] "GET /category/cameras HTTP/1.1" 200 100 "http://www.google.com/search?ie=UTF-8&q=google&sclient=psy-ab&q=Cameras+Books&oq=Cameras+Books&aq=f&aqi=g-vL1&aql=&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.,cf.osb&biw=2640&bih=427" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; YTB730; GTB7.2; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; Media Center PC 6.0)"
208.219.150.176 - - [04/Aug/2016:00:00:25 +0000] "GET /category/software HTTP/1.1" 200 117 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; GTB7.2; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C)"
160.165.186.172 - - [04/Aug/2016:00:00:30 +0000] "GET /category/office HTTP/1.1" 200 101 "/category/electronics" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; YTB720; GTB7.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)"
224.150.219.97 - - [04/Aug/2016:00:00:35 +0000] "GET /category/jewelry HTTP/1.1" 200 74 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
I've checked my filter in grokdebug and everything works fine there, but whenever I push these logs to Logstash it doesn't apply the filter; instead, every log entry gets a "_grokparsefailure" tag.
Any idea what the problem is? I've followed several guides but I still have this issue.
P.S. I know about COMBINEDAPACHELOG, but I still want to parse it my own way, to gain experience and understand the ELK stack better.
Try changing messege to message in your grok match (change the 'e' to an 'a'):
match => ["message", "%{IP:client} - - \[%{HTTPDATE:event_date}\] %{QS:first} %{NUMBER:response} %{NUMBER:bytes} %{QS:destination} %{QS:browser}"]