Kafka producer cannot validate records without a PK and returns InvalidRecordException
I have an error with my Kafka producer. I am using the Debezium Kafka connector v1.1.0.Final with Kafka 2.4.1. Tables that have a PK are all flushed cleanly, but unfortunately for tables without a PK it gives me this error:
[2020-04-14 10:00:00,096] INFO Exporting data from table 'public.table_0' (io.debezium.relational.RelationalSnapshotChangeEventSource:280)
[2020-04-14 10:00:00,097] INFO For table 'public.table_0' using select statement: 'SELECT * FROM "public"."table_0"' (io.debezium.relational.RelationalSnapshotChangeEventSource:287)
[2020-04-14 10:00:00,519] INFO Finished exporting 296 records for table 'public.table_0'; total duration '00:00:00.421' (io.debezium.relational.RelationalSnapshotChangeEventSource:330)
[2020-04-14 10:00:00,522] INFO Snapshot - Final stage (io.debezium.pipeline.source.AbstractSnapshotChangeEventSource:79)
[2020-04-14 10:00:00,523] INFO Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfo=source_info[server='postgres'db='xxx, lsn=38/C74913C0, txId=4511542, timestamp=2020-04-14T02:00:00.517Z, snapshot=FALSE, schema=public, table=table_0], partition={server=postgres}, lastSnapshotRecord=true]] (io.debezium.pipeline.ChangeEventSourceCoordinator:90)
[2020-04-14 10:00:00,524] INFO Connected metrics set to 'true' (io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics:59)
[2020-04-14 10:00:00,526] INFO Starting streaming (io.debezium.pipeline.ChangeEventSourceCoordinator:100)
[2020-04-14 10:00:00,550] ERROR WorkerSourceTask{id=pg_dev_pinjammodal-0} failed to send record to table_0: (org.apache.kafka.connect.runtime.WorkerSourceTask:347)
org.apache.kafka.common.InvalidRecordException: This record has failed the validation on broker and hence be rejected.
I checked the tables and they seem to contain valid records. I have set producer.ack=1 in my producer configuration. Could this setting be what triggers the invalidation here?
The problem is that the Kafka topics for the non-PK tables were created with log compaction enabled, and compaction requires every message to have a key. The messages have no key because the tables have no PK, so the broker rejects them during validation.
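You can reproduce the broker-side rejection outside of Debezium. Below is a minimal sketch (the topic name table_0, the localhost:9092 bootstrap address, and the class name are assumptions) that sends a keyless record to a topic configured with cleanup.policy=compact; on Kafka 2.4 the send should fail with the same InvalidRecordException seen in the log above:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class KeylessRecordRepro {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("acks", "1");                           // same setting as in the question
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Null key on a log-compacted topic: the broker cannot compact a
            // keyless record, so validation fails and the record is rejected.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("table_0", null, "{\"some\":\"payload\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    // Expect org.apache.kafka.common.InvalidRecordException here
                    exception.printStackTrace();
                }
            });
            producer.flush();
        }
    }
}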
The solution is to not enable log compaction on those topics, and/or to not pre-create them so that they are auto-created with the default cleanup policy. Another option is to add a PK to the tables.
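If you do want to pre-create the topics, the following sketch (the topic name, partition count, and replication factor are assumptions) uses the Kafka AdminClient to create the topic with cleanup.policy=delete, which does not require message keys:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateNonCompactedTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // cleanup.policy=delete (the broker default) accepts keyless
            // records, unlike cleanup.policy=compact.
            NewTopic topic = new NewTopic("table_0", 1, (short) 1)
                    .configs(Collections.singletonMap(
                            TopicConfig.CLEANUP_POLICY_CONFIG,
                            TopicConfig.CLEANUP_POLICY_DELETE));
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}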