org.apache.kafka.connect.runtime.rest.errors.BadRequestException
I am trying to write a Kafka connector to move data from a Kafka topic into MongoDB (sink). I have added the required configuration to the connect-json-standalone.properties file as well as the connect-mongo-sink.properties file in the kafka folder. While starting the connector, I get the following exception:
[2019-07-23 18:07:17,274] INFO Started o.e.j.s.ServletContextHandler@76e3b45b{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:855)
[2019-07-23 18:07:17,274] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:231)
[2019-07-23 18:07:17,274] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2019-07-23 18:07:17,635] INFO Cluster created with settings {hosts=[localhost:27017], mode=MULTIPLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500} (org.mongodb.driver.cluster:71)
[2019-07-23 18:07:17,636] INFO Adding discovered server localhost:27017 to client view of cluster (org.mongodb.driver.cluster:71)
[2019-07-23 18:07:17,760] INFO Closing all connections to repracli/localhost:27017 (io.debezium.connector.mongodb.ConnectionContext:86)
[2019-07-23 18:07:17,768] ERROR Failed to create job for ./etc/kafka/connect-mongodb-sink.properties (org.apache.kafka.connect.cli.ConnectStandalone:104)
[2019-07-23 18:07:17,769] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:115)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
A value is required
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:112)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
A value is required
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.runtime.AbstractHerder.maybeAddConfigErrors(AbstractHerder.java:423)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:188)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:109)
[2019-07-23 18:07:17,782] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:66)
[2019-07-23 18:07:17,782] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:239)
[2019-07-23 18:07:17,790] INFO Stopped http_localhost8084@5f96f6a2{HTTP/1.1,[http/1.1]}{localhost:8084} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-07-23 18:07:17,790] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-07-23 18:07:17,792] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:256)
[2019-07-23 18:07:17,793] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:94)
[2019-07-23 18:07:17,793] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:185)
[2019-07-23 18:07:17,794] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-07-23 18:07:17,796] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:206)
[2019-07-23 18:07:17,799] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:111)
[2019-07-23 18:07:17,800] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:71)
I have tried changing the connection.uri in connect-mongo-sink.properties in several ways to resolve this, but without success. I also searched through some links, but they did not solve my problem either.
Referral links:
https://groups.google.com/forum/#!topic/debezium/bC4TUld5NGw
https://github.com/confluentinc/kafka-connect-jdbc/issues/334
connect-json-standalone.properties:
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schema.registry.url=http://localhost:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
connect-mongo-sink.properties:
name=mongodb-sink-connector
connector.class=io.debezium.connector.mongodb.MongoDbConnector
tasks.max=1
topics=sample-consumerr-sink-topic
type.name=kafka-connect
mongodb.hosts=repracli/localhost:27017
mongodb.collection=conn_mongo_sink_collc
mongodb.connection.uri=mongodb://localhost:27017/conn_mongo_sink_db?w=1&journal=true
I expect the sink connector to work so that topic data is consumed into the MongoDB collection named "conn_mongo_sink_collc". Can anyone help me resolve this error?
Note: I am using a 3-node MongoDB replica set, where port 27017 is the primary and ports 27018 and 27019 are secondaries.
io.debezium.connector.mongodb.MongoDbConnector
is a Source connector, for getting data from MongoDB into Kafka.
To stream data from Kafka into MongoDB, use a Sink connector. MongoDB recently released its own sink connector and blogged about its use, including example configuration.
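As a sketch only: a minimal configuration for MongoDB's official sink connector might look like the following. The property names are taken from the MongoDB Kafka connector documentation; the topic, replica set, database, and collection values are reused from the question and are assumptions about your setup.

```properties
# Sketch: MongoDB official sink connector (not Debezium, which is source-only).
name=mongodb-sink-connector
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=sample-consumerr-sink-topic
# Connect to all replica set members; replica set name from the question.
connection.uri=mongodb://localhost:27017,localhost:27018,localhost:27019/?replicaSet=repracli
database=conn_mongo_sink_db
collection=conn_mongo_sink_collc
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
```

Note that the sink connector uses `connection.uri`/`database`/`collection` rather than Debezium's `mongodb.*` properties.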
Faced the same issue. This also happens when the connector configuration is incomplete or invalid.
Check whether the required properties are set by reviewing the debezium documentation.
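As the error message itself suggests, you can check a configuration against the Connect worker's validation endpoint before starting the connector. A sketch, assuming the worker's REST port is 8084 as in the log above; the path uses the connector's unqualified class name, and the body reuses properties from the question:

```shell
# Ask the Kafka Connect REST API to validate the config; the response
# lists each property and any validation errors (e.g. "A value is required").
curl -s -X PUT -H "Content-Type: application/json" \
  http://localhost:8084/connector-plugins/MongoDbConnector/config/validate \
  -d '{
        "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
        "tasks.max": "1",
        "topics": "sample-consumerr-sink-topic",
        "mongodb.hosts": "repracli/localhost:27017"
      }'
```

This requires the worker to be running, so it is easiest to test with a worker started without any connector properties files.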