Not able to create index on elasticsearch using logstash

I am using Elasticsearch with Logstash to visualize a dataset in Kibana.

I created a configuration file according to the documentation, started Elasticsearch and Kibana, and then ran Logstash with the configuration file. When it finished, I got the message below. I know the file was not loaded, because otherwise I would have seen the dataset at the prompt. I also searched for the index in the Kibana dashboard, but it does not show up.

Here is the output I get at the prompt:

Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to D:/AMS/Softwares/logstash-7.4.2/logs which is now configured via log4j2.properties
[2019-11-25T14:05:08,406][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-11-25T14:05:08,429][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2019-11-25T14:05:10,623][INFO ][org.reflections.Reflections] Reflections took 42 ms to scan 1 urls, producing 20 keys and 40 values
[2019-11-25T14:05:11,458][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"Imports", id=>"7195e24081a8419104011573b541ec87e57c49b74d413cad474911f90ee68a82", hosts=>[//localhost], document_type=>"Imports20162017", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_9e38ae5e-1762-4e76-9a15-bf2e71e368a8", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-11-25T14:05:12,021][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-11-25T14:05:12,260][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-11-25T14:05:12,322][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-11-25T14:05:12,327][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-11-25T14:05:12,375][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-11-25T14:05:12,450][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-11-25T14:05:12,549][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-11-25T14:05:12,565][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-11-25T14:05:12,572][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x3f750db6 run>"}
[2019-11-25T14:05:15,423][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: File paths must be absolute, relative path specified: D:\AMS\Docs\ELK Related\Data Set\import-and-export-by-india\PC_Import_2016_2017.csv>, :backtrace=>["D:/AMS/Softwares/logstash-7.4.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.11/lib/logstash/inputs/file.rb:269:in `block in register'", "org/jruby/RubyArray.java:1800:in `each'", "D:/AMS/Softwares/logstash-7.4.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.11/lib/logstash/inputs/file.rb:267:in `register'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:195:in `block in register_plugins'", "org/jruby/RubyArray.java:1800:in `each'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:194:in `register_plugins'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:296:in `start_inputs'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:252:in `start_workers'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:149:in `run'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:108:in `block in start'"], :thread=>"#<Thread:0x3f750db6 run>"}
[2019-11-25T14:05:15,452][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2019-11-25T14:05:15,848][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-11-25T14:05:20,768][INFO ][logstash.runner          ] Logstash shut down.

Here is the Logstash config file I am using:

input {
  file {
    path => "D:\AMS\Docs\ELK Related\Data Set\import-and-export-by-india\PC_Import_2016_2017.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "pc_code", "pc_description", "unit", "country_code", "country_name", "quantity", "value" ]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "Imports"
    document_type => "Imports20162017"
  }
}

Am I doing something wrong?

Solved

The problem was the backslashes in the file path. I switched to forward slashes and it worked fine.

To visualize and explore data in Kibana, you must create an index pattern. An index pattern tells Kibana which Elasticsearch indices contain the data you want to work with. An index pattern can match a single index, multiple indices, or a rollup index.

Go to Management >> Index Patterns >> + Create index pattern
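
If the index is not offered when you create the pattern, it helps to first confirm that the index actually exists in Elasticsearch. A quick sanity check, assuming the default http://localhost:9200 endpoint shown in your logs:

curl -X GET "http://localhost:9200/_cat/indices?v"

If your index is missing from the listing, the Logstash pipeline never wrote any documents, and creating an index pattern will not help until indexing succeeds.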

Once the index pattern is created, you should be able to visualize your documents.

More details here: Index patterns
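
If you prefer to script this step, Kibana also exposes a saved objects API that can create the index pattern for you. A minimal sketch, assuming Kibana runs on the default localhost:5601 and that imports* is the pattern title you want (adjust it to match your index name):

curl -X POST "http://localhost:5601/api/saved_objects/index-pattern" \
  -H "kbn-xsrf: true" -H "Content-Type: application/json" \
  -d '{"attributes": {"title": "imports*"}}'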

The problem is this error:

File paths must be absolute, relative path specified

You need to get rid of the backslashes in the path and use forward slashes instead, like this:

input {
  file {
    path => "D:/AMS/Docs/ELK Related/Data Set/import-and-export-by-india/PC_Import_2016_2017.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}
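
Two more things worth checking once the path is fixed. First, Elasticsearch index names must be lowercase, so the index Imports may be rejected when documents are written. Second, document_type is deprecated, as the WARN in your log says, because document types were removed in Elasticsearch 7, so it can simply be dropped. A sketch of the output block with both adjusted (the lowercase name imports is just an example):

output {
  elasticsearch {
    hosts => "localhost"
    # index names must be lowercase in Elasticsearch
    index => "imports"
    # document_type dropped: document types were removed in Elasticsearch 7
  }
}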