Spring Cloud DataFlow - could not convert messages from Kafka

I am using Spring Cloud DataFlow version 1.2.2 with the following configuration:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.type=kafka
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=<MY_BROKER>
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.zkNodes=<MY_ZK>
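For readability, the same server-level configuration expressed as YAML (an equivalent sketch of the three properties above):

spring:
  cloud:
    dataflow:
      applicationProperties:
        stream:
          spring:
            cloud:
              stream:
                binders:
                  kafka1:
                    type: kafka
                    environment:
                      spring:
                        cloud:
                          stream:
                            kafka:
                              binder:
                                brokers: <MY_BROKER>
                                zkNodes: <MY_ZK>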

I am trying to create a stream that reads from a specific topic and dumps it into a log sink, as follows:

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' > bridge | log" --deploy

Looking at the log file, I can see the following error:

2017-07-17 09:44:01,700  INFO -kafka-listener-1 log-sink:202 - [B@79d0a6b6
2017-07-17 09:44:01,700 ERROR -kafka-listener-1 o.s.c.s.b.k.KafkaMessageChannelBinder:283 - Could not convert message: 7B226D657472696354696D657374616D70223A313530303233373037302C226D65747269634E616D65223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F313136222C224074696D657374616D70223A22323031372D30372D31365432303A33313A32352E3438325A222C22706F7274223A33363133302C226D657472696356616C7565223A302C224076657273696F6E223A2231222C22686F7374223A223137322E32362E312E313135222C226D657373616765223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F31313620302031353030323337303730227D
java.lang.StringIndexOutOfBoundsException: String index out of range: 380
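As an aside, the hex dump in that error is just the UTF-8 bytes of the record. A minimal, self-contained sketch (a hypothetical helper class, not part of the stream) to decode it and confirm the payload is a plain JSON line:

import java.nio.charset.StandardCharsets;

// Decodes a hex dump such as the one in the "Could not convert message" entry above
// back into its UTF-8 text.
public class DecodeHexPayload {
    public static void main(String[] args) {
        String hex = args[0]; // pass the hex dump from the log as the first argument
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        // For the dump above, this prints the JSON record, starting with
        // {"metricTimestamp":1500237070,"metricName":"cewc.reconnaissance_and_scanning...
        System.out.println(new String(bytes, StandardCharsets.UTF_8));
    }
}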

I also tried configuring some properties for the Kafka source consumer/producer:

stream create --name metricsStream --definition ":metrics --spring.kafka.consumer.valueDerserializer=org.apache.kafka.common.serialization.StringDeserializer --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' --spring.cloud.stream.bindings.input.consumer.headerMode=raw --spring.cloud.stream.bindings.output.producer.headerMode=raw  --outputType='text/plain;charset=UTF-8' > bridge | log" --deploy

But I got the same result.

These are the consumer details printed by Spring DataFlow:

2017-07-17 09:43:57,267  INFO main o.a.k.c.c.ConsumerConfig:180 - ConsumerConfig values:
    auto.commit.interval.ms = 100
    auto.offset.reset = earliest
    bootstrap.servers = [172.26.1.63:9092]
    check.crcs = true
    client.id = consumer-2
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = metrics_KafkaToHdfs_5
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

I have seen a similar question, but with no working answer: what is the property to accept binary json message in spring-cloud-stream kafka binder

My Kafka metrics topic contains JSON lines. How should I configure Spring DataFlow so that it reads the Kafka topic as JSON (or at least as a string that looks like JSON)?

Have you tried configuring the input content type?

spring.cloud.stream.bindings.input.content-type=application/json
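Applied inline to the stream definition from the question, that would look something like this (a sketch; the binder setting and the bridge/log sink are kept as in the original definition):

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.input.content-type=application/json > bridge | log" --deploy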

Or possibly with the Spring Cloud Dataflow prefix:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.bindings.input.content-type=application/json
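If the Dataflow server is launched from the command line, the same property can also be passed as a startup argument so that it applies to every deployed stream app (a sketch, assuming the local server jar for version 1.2.2):

java -jar spring-cloud-dataflow-server-local-1.2.2.RELEASE.jar \
  --spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.bindings.input.content-type=application/json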