Logstash not writing to elasticsearch (_discover_file_glob)

Hey, I set up JMeter and Logstash for Elasticsearch with Kibana following this tutorial: http://ecmarchitect.com/archives/2014/09/09/3932

The first time everything worked fine: Logstash created a new jmeter-results index and populated it with my JMeter data. Today I tried the same thing with new JMeter data, but nothing happens. No errors are raised, but in the Logstash log I can see _discover_file_glob being logged over and over again. Here is the relevant part of my log:

 Registering file input {:path=>["/etc/apache-jmeter-2.12/bin/log2.jtl"], :level=>:info, :file=>"logstash/inputs/file.rb", :line=>"74"}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_66c8ea3a6e5fbda3879299a795b893d5", :path=>["/etc/apache-jmeter-2.12/bin/log2.jtl"], :level=>:info, :file=>"logstash/inputs/file.rb", :line=>"115"}
Pipeline started {:level=>:info, :file=>"logstash/pipeline.rb", :line=>"78"}
_sincedb_open: reading from /root/.sincedb_66c8ea3a6e5fbda3879299a795b893d5 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"199"}
_sincedb_open: setting [33297239, 0, 2306] to 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"203"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file: /etc/apache-jmeter-2.12/bin/log2.jtl: new: /etc/apache-jmeter-2.12/bin/log2.jtl (exclude is []) {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"126"}
_open_file: /etc/apache-jmeter-2.12/bin/log2.jtl: opening {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"98"}
/etc/apache-jmeter-2.12/bin/log2.jtl: sincedb last value 44106, cur size 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"122"}
/etc/apache-jmeter-2.12/bin/log2.jtl: sincedb: seeking to 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"124"}
writing sincedb (delta since last write = 1421612560) {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"177"}
/etc/apache-jmeter-2.12/bin/log2.jtl: file grew, old size 0, new size 44106 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"81"}
Automatic template management enabled {:manage_template=>"true", :level=>:info, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"104"}
Template Search URL: {:template_search_url=>"http://localhost:9200/_template/*", :level=>:debug, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"112"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}

I read online that the fix is to delete the .sincedb_ files, but even after doing that nothing happens.

Maybe someone can help me?

Logstash remembers how far it has read each file in its sincedb; your log shows the stored offset (44106) already equals the current file size, so there is nothing new for it to read. Set start_position to beginning (and disable the sincedb) in the Logstash input section so the same CSV file is processed again from the start:

input {
  file {
    path => [ "/CSV_File.csv" ]
    type => "JMeterlog"
    # Read the file from the beginning instead of only tailing new lines
    start_position => "beginning"
    # Don't persist read positions, so the file is re-read on every restart
    sincedb_path => "/dev/null"
  }
}
filter {
  # Drop the CSV header row (it contains the literal string "responseCode")
  if [message] =~ "responseCode" {
    drop { }
  }
  else {
    csv { columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "dataType", "success", "bytes", "grpThreads", "allThreads", "URL", "Latency", "SampleCount", "ErrorCount", "IdleTime"] }
  }
}
output {
  # Print each parsed event to the console for debugging
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-jmeter-results-%{+YYYY.MM.dd}"
    template => "jmeter-results-mapping.json"
    template_name => "logstash-jmeter-results"
    template_overwrite => false
  }
}
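
If you want to re-read the existing JMeter results file from the question rather than a CSV export, the same idea applies to its input. A minimal sketch of the input section, assuming the file path shown in your log:

input {
  file {
    # Path taken from the original question; adjust to wherever JMeter writes its results
    path => [ "/etc/apache-jmeter-2.12/bin/log2.jtl" ]
    type => "JMeterlog"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

With sincedb_path pointed at /dev/null, Logstash forgets its read offsets on every restart and ingests the whole file again each time the pipeline starts; remove that line once you go back to tailing live results.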