Logstash: geoip from json
I'm trying to geolocate the requests hitting my Rails application. I have configured Lograge to produce my logs in JSON.
I think Logstash is not able to retrieve remote_ip from the JSON and run it through geoip.
Here is the decoded JSON; the geoip field is empty in Kibana:
{
  "_index": "logstash-2016.03.15",
  "_type": "rails logs",
  "_id": "AVN6t1-FkghE9kQv20fc",
  "_score": null,
  "_source": {
    "@version": "1",
    "@timestamp": "2016-03-15T14:39:10.176Z",
    "client": {
      "host": "www.myapp.com",
      "remote_ip": "\"xx.xx.xx.xxx\"",
      "user_agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 Safari/537.36",
      "browser": "Chrome",
      "browser_version": "48.0.2564.116",
      "plateform": "windows"
    },
    "geoip": {}
  },
  "fields": {
    "@timestamp": [
      1458052750176
    ]
  },
  "sort": [
    1458052750176
  ]
}
Here is my logstash.conf:
input {
  file {
    type => "rails logs"
    # * is for indexing rotated logs
    path => "/var/www/myapp/shared/log/production.log*"
  }
}
filter {
  grok {
    match => [
      "message",
      "%{DATA:data}%{LOGLEVEL:loglevel} -- : %{GREEDYDATA:json}({({[^}]+},?\s*)*})?\s*$(?<stacktrace>(?m:.*))?"
    ]
    remove_field => ["message"]
  }
  json {
    source => "json"
    remove_field => ["json"]
  }
  geoip {
    source => "[client][remote_ip]"
    target => "geoip"
    database => "/etc/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float"]
  }
}
output {
  elasticsearch {
  }
}
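One way to see what the filters actually produce (just a debugging sketch, not part of the config above) would be a temporary stdout output with the rubydebug codec alongside elasticsearch, so the parsed value of [client][remote_ip] shows up on the console:

output {
  # Temporary console output for debugging: prints every event with all
  # of its fields so the result of the grok and json filters can be checked.
  stdout { codec => rubydebug }
}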
Am I missing something in my configuration? Thanks in advance.
It looks like "remote_ip" is not parsed correctly: it still contains literal double quotes. I guess the geoip filter does nothing because it does not treat remote_ip as an IP address.
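If the embedded quotes really are the problem, one possible workaround (a sketch, not verified against my logs) would be to strip them with a mutate gsub placed before the geoip filter:

filter {
  mutate {
    # Strip the literal double quotes so geoip receives a bare address
    # (xx.xx.xx.xxx) instead of "xx.xx.xx.xxx". This block has to appear
    # before the geoip filter, since filters run in the order they are declared.
    gsub => [ "[client][remote_ip]", "\"", "" ]
  }
}

Alternatively, adjusting the grok pattern (or the Lograge formatter) so the quotes never end up in the field in the first place would avoid the extra filter.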