Docker: can't send data from Logstash container to Kafka container
I have 2 Docker containers, one running Logstash and the other running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but I can't seem to get the data through to my topic in Kafka.
I can log into the Docker Kafka container, produce a message to my topic from the terminal, and then consume it as well.
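(For reference, that in-container test was roughly the following, using the console tools that ship with Kafka; the exact flags vary by Kafka version, e.g. newer consumers take --bootstrap-server instead of --zookeeper, and MyTopicName is the same placeholder topic as in my config below:)
# inside the Kafka container
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MyTopicName
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic MyTopicName --from-beginning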
I am using the kafka output plugin:
output {
  kafka {
    topic_id => "MyTopicName"
    broker_list => "kafkaIPAddress:9092"
  }
}
The IP address is the one I got from running docker inspect kafka2.
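(A sketch of that lookup, assuming the container is named kafka2 and sits on Docker's default bridge network:)
docker inspect --format '{{ .NetworkSettings.IPAddress }}' kafka2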
When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf, I get this error:
Settings: Default pipeline workers: 4
Unknown setting 'broker_list' for kafka {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
I have checked the configuration of the file by running the following command, which returns OK:
./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
Configuration OK
Has anyone had this problem before? Is it that I haven't opened the port on the Kafka container, and if so, how can I do that while keeping Kafka running?
The error is here: broker_list => "kafkaIPAddress:9092"
Try bootstrap_servers => "KafkaIPAddress:9092" instead.
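With that change, the output section from the question becomes (topic name and address kept as the question's placeholders):
output {
  kafka {
    topic_id => "MyTopicName"
    bootstrap_servers => "kafkaIPAddress:9092"
  }
}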
If the containers are on different machines, map Kafka to the host's port 9092 and use the host address:port; if they are on the same host, use the internal Docker IP:port.
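For the cross-host case, a minimal sketch of publishing the broker port when starting the Kafka container (the image name is a placeholder; depending on the Kafka version you may also need to point its advertised host/listeners at the host address):
docker run -d --name kafka2 -p 9092:9092 your-kafka-image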