Missing required configuration "key.converter" which has no default value

When I try to start Kafka Connect for the Elasticsearch stream reactor in standalone mode, I get the following error:

Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.converter" which has no default value.
        at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:463)
        at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:453)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
        at org.apache.kafka.connect.runtime.WorkerConfig.<init>(WorkerConfig.java:218)
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.<init>(DistributedConfig.java:272)
        at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:72)

How can I fix this error?

Edit 5 January 2018: Sorry, I will try to be more specific. I am using the stream-reactor connectors: https://github.com/Landoop/stream-reactor This is the command I launch from an EC2 instance, which hosts the only broker of my Kafka cluster:

./bin/connect-standalone.sh config/elastic-config.properties config/connect-standalone.properties

This is connect-standalone.properties:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

# The internal converter used for offsets and config data is configurable and must be specified, but most users will
# always want to use the built-in default. Offset and config data is never visible outside of Copycat in this format.
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
plugin.path=/home/ubuntu/kafka_2.11-1.0.1/libs

And this is the other file:

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=test
topic.index.map=test:test_index
connection.url=myurl
type.name=log
key.ignore=true
schema.ignore=true

The error says it all, really. You are missing the required key.converter configuration entry. This tells Kafka Connect how to deserialize the data on the Kafka topic (typically JSON or Avro).
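For context on what key.converter and value.converter control: with JsonConverter and schemas.enable=true (as in the worker config above), Connect expects each record on the topic to be a JSON envelope carrying both the schema and the payload. A minimal sketch of such a record (the schema and field names here are hypothetical, not taken from the question's topic):

```python
import json

# With JsonConverter and schemas.enable=true, each record is a JSON envelope
# with a "schema" part describing the structure and a "payload" part holding
# the actual data. This example record is purely illustrative.
record = {
    "schema": {
        "type": "struct",
        "name": "log",
        "optional": False,
        "fields": [
            {"field": "message", "type": "string", "optional": False},
        ],
    },
    "payload": {"message": "hello from kafka"},
}

# This is the wire format the converter would produce/consume on the topic.
print(json.dumps(record))
```

If your topic carries plain JSON without this envelope, set schemas.enable=false on the converter instead.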

You can see an example of a valid connector configuration for Elasticsearch in this gist. If you update your question to include the configuration you are using, I can point out how to merge it in.


Having seen your configuration, the cause of the error is that you are invoking Connect with the configuration files in the wrong order, so Connect cannot find the configuration it expects.

It should be:

./bin/connect-standalone.sh config/connect-standalone.properties config/elastic-config.properties
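To make the ordering explicit: connect-standalone.sh reads the first file as the worker configuration, and that is where it looks for key.converter at startup; the connector files follow. As a purely illustrative sketch, here is a check of whether a worker file defines the settings Connect requires (the sample text mirrors the question's connect-standalone.properties):

```python
# Settings Connect requires in the worker config (the FIRST file argument).
REQUIRED = ["bootstrap.servers", "key.converter", "value.converter"]

# Sample worker config, mirroring the relevant lines from the question.
worker_config = """\
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
"""

# Collect the keys defined in the file and report any required ones missing.
defined = {line.split("=", 1)[0] for line in worker_config.splitlines() if "=" in line}
missing = [key for key in REQUIRED if key not in defined]
print("missing:", missing or "none")  # → missing: none
```

Passing elastic-config.properties first means Connect parses a file with none of these keys as the worker config, which produces exactly the ConfigException in the question.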

Read more about streaming data from Kafka into Elasticsearch in this article, part of a general series of articles on using Kafka Connect: