Logstash output not showing up in Kibana
I've just started learning Elasticsearch and am trying to dump IIS logs into ES through Logstash to see what they look like in Kibana.
All three agents are set up and run without errors, but when I run Logstash against my stored log files, the logs never show up in Kibana.
(I'm using ES 5.0 without the 'head' plugin.)
Here is the output I see from the logstash command:
Sending Logstash logs to C:/elasticsearch-5.0.0/logstash-5.0.0-rc1/logs which is now configured via log4j2.properties.
06:28:26.067 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
06:28:26.081 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
06:28:26.501 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
06:28:26.573 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
06:28:26.717 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
06:28:26.736 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
06:28:26.857 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
But Kibana doesn't show any indices. I'm new to this and not sure what is happening internally. Can you help me understand what is going wrong here?
Logstash config file:
input {
  file {
    type => "iis-w3c"
    path => "C:/Users/ras/Desktop/logs/logs/LogFiles/test/aug1/*.log"
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
  }
  mutate {
    ## Convert some fields from strings to integers
    convert => ["bytesSent", "integer"]
    convert => ["bytesReceived", "integer"]
    convert => ["timetaken", "integer"]
    ## Create a new field for the reverse DNS lookup below
    add_field => { "clientHostname" => "%{clientIP}" }
    ## Finally remove the original log_timestamp field since the event will
    ## have the proper date on it
    remove_field => [ "log_timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{type}-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}
You can check which indices exist in Elasticsearch with a plugin such as kopf, or with the _cat/indices endpoint: point a browser at [ip of ES]:9200/_cat/indices, or query it directly with curl: curl [ip of ES]:9200/_cat/indices.
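For example, with Elasticsearch running locally (the localhost:9200 address is an assumption matching the config above), the curl call could look like this:

```shell
# ?v adds column headers to the tabular _cat output, making the
# health, index-name, and docs.count columns easier to read.
curl 'localhost:9200/_cat/indices?v'
```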
In Kibana you have to provide a pattern for the index names; by default it is logstash-*, as shown in your screenshot. Kibana uses this default because the default index pattern in Logstash's elasticsearch output plugin is logstash-%{+YYYY.MM.dd} (cf. the docs), which is used to name the indices created by that plugin.
In your case, however, the plugin is configured with index => "%{type}-%{+YYYY.MM}", so the indices it creates will have the form iis-w3c-%{+YYYY.MM} (since the type of your events is iis-w3c). You therefore have to replace logstash-* with iis-w3c-* in the Index name or pattern field.
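As a sanity check before changing the pattern in Kibana, you can ask _cat/indices for just the indices this pipeline should have created (again assuming Elasticsearch is reachable on localhost:9200):

```shell
# Filter the listing to the pipeline's indices; if nothing comes back,
# the events never reached Elasticsearch and the problem is upstream.
curl 'localhost:9200/_cat/indices/iis-w3c-*?v'
```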