Confluent Kafka Sink Connector is not loading data to Postgres table

I am trying to load data into Postgres tables through the Kafka JDBC Sink Connector, but I am getting the following error:

Caused by: org.apache.kafka.connect.errors.ConnectException: Cannot ALTER to add missing field SinkRecordField{schema=Schema{STRING}, name='A_ABBREV', isPrimaryKey=false}, as it is not optional and does not have a default value

The table in the Postgres DB already has the column A_ABBREV, so I am not sure why I am getting a missing-field error.

Has anyone faced a similar issue?

Below is my sink connector configuration:

connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
table.name.format=AGENCY
connection.password=passcode
topics=AGENCIES
tasks.max=1
batch.size=10000
fields.whitelist=A_ID, A_NAME, A_ABBREV
connection.user=pmmdevuser
name=partner5-jdbcSinkConnector
connection.url=jdbc:postgresql://aws-db.sdfdgfdrwwisc.us-east-1.rds.amazonaws.com:3306/pmmdevdb?currentSchema=ams
insert.mode=upsert
pk.mode=record_value
pk.fields=A_ID
auto.create=false

I am using Liquibase scripts to create the tables; below is the CREATE query for the table, taken from the Postgres database:

"CREATE TABLE gds.agency
(
    a_id integer NOT NULL,
    a_name character varying(100) COLLATE pg_catalog."default" NOT NULL,
    a_abbrev character varying(8) COLLATE pg_catalog."default" NOT NULL,
    source character varying(255) COLLATE pg_catalog."default" NOT NULL DEFAULT 'AMS'::character varying,
    CONSTRAINT pk_agency PRIMARY KEY (a_id),
    CONSTRAINT a_abbrev_uk1 UNIQUE (a_abbrev)
)"

From my experience, this error means that the field definitions on the sink table/database do not match the source's field definitions. Make sure the field definitions match. Look at the individual record the sink connector is trying to write to the target database; you should be able to see the generated insert statement in the stack trace when running in debug mode. Take that query and run it manually against the database to get a clearer picture of the error.
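
For reference, here is a minimal sketch of both checks, assuming the gds.agency table from the DDL above (the schema/table names and the sample values are placeholders, not taken from your actual data; the connection.url in the config points at currentSchema=ams, so adjust the schema accordingly):

    -- Check what Postgres actually has for the target table, so it can be
    -- compared column by column with the fields the connector is sending.
    SELECT column_name, data_type, is_nullable, column_default
    FROM information_schema.columns
    WHERE table_schema = 'gds'      -- placeholder: schema from the DDL above
      AND table_name   = 'agency'   -- placeholder: table from the DDL above
    ORDER BY ordinal_position;

    -- Once the failing statement is visible in the debug-level stack trace,
    -- run an equivalent upsert by hand. The values below are placeholders;
    -- the ON CONFLICT clause mirrors insert.mode=upsert with pk.fields=A_ID
    -- from the connector config.
    INSERT INTO gds.agency (a_id, a_name, a_abbrev)
    VALUES (1, 'Test Agency', 'TST')
    ON CONFLICT (a_id)
    DO UPDATE SET a_name   = EXCLUDED.a_name,
                  a_abbrev = EXCLUDED.a_abbrev;

Running the hand-written statement directly will surface the database's own error message (wrong schema, missing column, NOT NULL violation, etc.), which is usually much clearer than the ConnectException wrapped around it.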