How to produce a tombstone Avro record in Kafka using .NET?
My sink.properties:
{
"name": "jdbc-oracle",
"config": {
"connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
"tasks.max": "1",
"topics": "orders",
"connection.url": "jdbc:oracle:thin:@10.1.2.3:1071/orac",
"connection.user": "ersin",
"connection.password": "ersin!",
"auto.create": "true",
"delete.enabled": "true",
"pk.mode": "record_key",
"pk.fields": "id",
"insert.mode": "upsert",
"plugin.path": "/home/ersin/confluent-5.4.1/share/java/",
"name": "jdbc-oracle"
},
"tasks": [
{
"connector": "jdbc-oracle",
"task": 0
}
],
"type": "sink"
}
My connect-avro-distributed.properties:
bootstrap.servers=10.0.0.0:9092
group.id=connect-cluster
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://10.0.0.0:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://10.0.0.0:8081
config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-statuses
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
My code:
var messageToSend = new Message<GenericRecord, GenericRecord>
{
    Key = recordKey
    //, Value = recordValue
};
When I try to send a message with a null value, I get an error (null reference). How can I fix this?

Thanks in advance.
You can produce a tombstone by declaring the message's value type as Null and using the built-in null serializer:

var tombstoner = new ProducerBuilder<int, Null>(_kafkaConfiguration.ProducerConfiguration)
    .SetKeySerializer(new AvroSerializer<int>(_schemaRegistryClient))
    .SetValueSerializer(Serializers.Null) // value type Null always serializes to an empty payload
    .Build();

var tasks = properties.Select(property => tombstoner.ProduceAsync(
    "yourTopicName",
    new Message<int, Null>
    {
        Key = 100,
        Value = null, // non-null key + null value = tombstone
        Timestamp = Timestamp.Default
    }));
Note, however, that it is currently not possible to consume tombstone records with the confluent-kafka-dotnet client.
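Putting the pieces together, a self-contained producer might look like the sketch below. This is only an illustration, not code from the original post: the broker and Schema Registry addresses are taken from the configs above as placeholders, the topic name "orders" matches the sink config, and the snippet assumes the Confluent.Kafka, Confluent.SchemaRegistry, and Confluent.SchemaRegistry.Serdes.Avro NuGet packages.

```csharp
using System;
using System.Threading.Tasks;
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

class TombstoneProducer
{
    static async Task Main()
    {
        // Placeholder addresses -- replace with your own cluster.
        var producerConfig = new ProducerConfig { BootstrapServers = "10.0.0.0:9092" };
        var registryConfig = new SchemaRegistryConfig { Url = "http://10.0.0.0:8081" };

        using var registry = new CachedSchemaRegistryClient(registryConfig);

        // The key is Avro-serialized so the JDBC sink (pk.mode=record_key)
        // can extract the primary key; the value type is Null, which
        // always serializes to an empty payload.
        using var producer = new ProducerBuilder<int, Null>(producerConfig)
            .SetKeySerializer(new AvroSerializer<int>(registry))
            .SetValueSerializer(Serializers.Null)
            .Build();

        // A record with a non-null key and a null value is a tombstone.
        // With delete.enabled=true, the JDBC sink translates it into a
        // DELETE for the row whose primary key matches the record key.
        var result = await producer.ProduceAsync(
            "orders",
            new Message<int, Null> { Key = 100, Value = null });

        Console.WriteLine($"Tombstone delivered to {result.TopicPartitionOffset}");
    }
}
```

This works because delete.enabled=true requires pk.mode=record_key, which is exactly how the sink config above is set up: the key carries the row identity, and the null value signals deletion.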