MongoDb Debezium - "Connector config contains no connector type"
I'm trying to do a POC with Kafka and Debezium.
I have Kafka and Zookeeper up and running. Now, when I try to start Kafka Connect (I'm fairly new to this), I get an error I don't understand, and I can't figure out what I'm doing wrong.
Note: I have already tested all of this with the Debezium tutorial Docker images, but I want to connect from a remote server, and I thought it would be easier to play with the configuration without Docker installed on everything.
I start Connect with the following command:
./connect-standalone.sh ~/kafka/config/connect-standalone.properties ~/kafka/config/connect-standalone-worker.properties ~/kafka/config/debezium-connector.properties
connect-standalone.properties
bootstrap.servers=localhost:9092
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.flush.interval.ms=10000
plugin.path=/home/ubuntu/kafka/plugins
connect-standalone-worker.properties
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/home/user/offest
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=org.apache.kafka.connect.json.JsonConverter
debezium-connector.properties
name=my-connector
connector.class=io.debezium.connector.mongodb.MongoDbConnector
include.schema.changes=false
mongodb.name=mymongo
collection.whitelist=my.collection
tasks.max=1
mongodb.hosts=A.B.C.D:27017
I get the following when running Connect:
[2018-12-27 15:31:41,995] ERROR Failed to create job for /home/ubuntu/kafka/config/connect-standalone-worker.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2018-12-27 15:31:41,996] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector config {internal.key.converter=org.apache.kafka.connect.json.JsonConverter, offset.storage.file.filename=/home/user/offest, internal.value.converter.schemas.enable=false, internal.value.converter=org.apache.kafka.connect.json.JsonConverter, value.converter=org.apache.kafka.connect.json.JsonConverter, internal.key.converter.schemas.enable=false, key.converter=org.apache.kafka.connect.json.JsonConverter} contains no connector type
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector config {internal.key.converter=org.apache.kafka.connect.json.JsonConverter, offset.storage.file.filename=/home/user/offest, internal.value.converter.schemas.enable=false, internal.value.converter=org.apache.kafka.connect.json.JsonConverter, value.converter=org.apache.kafka.connect.json.JsonConverter, internal.key.converter.schemas.enable=false, key.converter=org.apache.kafka.connect.json.JsonConverter} contains no connector type
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:259)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2018-12-27 15:31:41,997] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
connect-standalone.properties and connect-standalone-worker.properties need to be a single file.
The error is saying that connect-standalone-worker.properties contains no connector.class value (which it shouldn't, since it holds the worker properties, not a connector).
The command you're trying to run should look like
connect-standalone worker.properties connector1.properties [connector2.properties ... ]
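In other words, the worker settings from connect-standalone.properties and connect-standalone-worker.properties above get combined into one worker file, which is passed first, followed by the connector properties. A minimal sketch, assuming the merged file is saved back as ~/kafka/config/connect-standalone.properties (all values are taken from the question; the merge itself is the fix described above):

# connect-standalone.properties — merged worker configuration (one file)
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/home/user/offest
offset.flush.interval.ms=10000
plugin.path=/home/ubuntu/kafka/plugins

Then start standalone Connect with the worker file first and the connector file(s) after it:

./connect-standalone.sh ~/kafka/config/connect-standalone.properties ~/kafka/config/debezium-connector.properties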