Could not decode json type for key: file_name in a Spring Cloud Data Flow stream

I set up a stream with Spring Cloud Data Flow that reads a CSV file, transforms it with a custom processor, and logs the result:

stream create --name testsourcecsv --definition "file --mode=lines --directory=D:/toto/ --file.filename-pattern=adresses-28.csv --maxMessages=1000 | csvToMap --spring.cloud.stream.bindings.output.content-type=application/json | log --spring.cloud.stream.bindings.input.content-type=application/json" --deploy

The file and csvToMap applications work fine, but in the log application I see this kind of exception, for every record:

2019-12-03 11:32:46.500 ERROR 1328 --- [container-0-C-1] o.s.c.s.b.k.KafkaMessageChannelBinder  : Could not decode json type: adresses-28.csv for key: file_name

com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'adresses': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"adresses-28.csv"; line: 1, column: 10]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:703) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(UTF8StreamJsonParser.java:3532) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2627) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:832) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:729) ~[jackson-core-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4141) ~[jackson-databind-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4000) ~[jackson-databind-2.9.9.jar!/:2.9.9]
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3091) ~[jackson-databind-2.9.9.jar!/:2.9.9]
    at org.springframework.cloud.stream.binder.kafka.BinderHeaderMapper.lambda$toHeaders(BinderHeaderMapper.java:268) ~[spring-cloud-stream-binder-kafka-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
    at java.lang.Iterable.forEach(Iterable.java:75) ~[na:1.8.0_202]
    at org.springframework.cloud.stream.binder.kafka.BinderHeaderMapper.toHeaders(BinderHeaderMapper.java:251) ~[spring-cloud-stream-binder-kafka-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]

The file_relativePath header raises this exception too. I don't understand why spring-kafka tries to read these headers as JSON.
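
The parse failure itself is easy to reproduce outside the binder. A minimal sketch with plain Jackson (not the binder's code path, just the same JSON decode applied to a raw header value):

    import com.fasterxml.jackson.databind.ObjectMapper;

    public class HeaderDecodeRepro {
        public static void main(String[] args) throws Exception {
            // The header bytes are a raw string, not JSON: a JSON string would
            // be quoted. This throws JsonParseException: Unrecognized token
            // 'adresses': was expecting ('true', 'false' or 'null')
            new ObjectMapper().readValue("adresses-28.csv".getBytes(), Object.class);
        }
    }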

Furthermore, the log sink logs my records correctly:

2019-12-03 11:32:46.516  INFO 1328 --- [container-0-C-1] log-sink                                 : {"code_postal":"28200","id_fantoir":"28211_0127","source_nom_voie":"inconnue","numero":"1","code_insee":28211,"lon":1.260462,"code_insee_ancienne_commune":"","nom_afnor":"RUE DU VIEUX MOULIN","nom_voie":"Rue du Vieux Moulin","nom_ld":"","libelle_acheminement":"LOGRON","source_position":"inconnue","nom_commune":"Logron","nom_ancienne_commune":"","x":570633.27,"y":6784246.2,"alias":"","id":"28211_0127_00001","rep":"","lat":48.145756}

For debugging purposes I logged the Kafka headers in my csvToMap processor, which gives me:

2019-12-03 11:32:37.042  INFO 10788 --- [container-0-C-1] c.d.streams.processor.CsvToMapProcessor  : headers {sequenceNumber=152963, file_name=adresses-28.csv, sequenceSize=0, deliveryAttempt=1, kafka_timestampType=CREATE_TIME, file_originalFile=NonTrustedHeaderType [headerValue="D:\toto\adresses-28.csv", untrustedType=java.io.File], kafka_receivedMessageKey=null, kafka_receivedTopic=testsourcecsv.file, file_relativePath=adresses-28.csv, kafka_offset=430949, scst_nativeHeadersPresent=true, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@7c3e63db, correlationId=9547c02d-e617-d981-f9b5-8df231530f66, kafka_receivedPartitionId=0, contentType=text/plain, kafka_receivedTimestamp=1575299282558, kafka_groupId=testsourcecsv}

So I don't understand at all why the log sink tries to decode the file_name and file_relativePath headers.

I set everything up in a local environment.

My csvToMap processor is defined as follows:

    import java.util.Map;

    import com.fasterxml.jackson.core.JsonProcessingException;
    import com.fasterxml.jackson.databind.ObjectReader;
    import com.fasterxml.jackson.databind.ObjectWriter;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.cloud.stream.messaging.Processor;
    import org.springframework.integration.annotation.Transformer;
    import org.springframework.messaging.handler.annotation.Headers;
    import org.springframework.stereotype.Component;

    @Component
    public class CsvToMapProcessor {
        private static final Logger LOGGER = LoggerFactory.getLogger(CsvToMapProcessor.class);

        @Autowired
        @Qualifier("csvMapper")
        private ObjectReader csvMapper;

        @Autowired
        @Qualifier("jsonWriter")
        private ObjectWriter jsonWriter;

        @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
        public Map<String, Object> transform(String csvLine, @Headers Map<String, Object> headers) {
            try {
                LOGGER.info("headers {}", headers);
                // Parse one CSV line into a Map; the binding's content-type
                // (application/json) serializes it as JSON on the way out.
                Map<String, Object> map = csvMapper.readValue(csvLine);
                return map;
            } catch (JsonProcessingException e) {
                LOGGER.error("An error occurred while reading CSV line {} : {}", csvLine, e.getMessage());
                LOGGER.debug(e.getMessage(), e);
                return null;
            }
        }
    }
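
The csvMapper and jsonWriter beans are not shown in the question; for completeness, here is a plausible configuration sketch. It assumes jackson-dataformat-csv is on the classpath; the CsvToMapConfiguration class name and the column list are hypothetical (the real file has many more fields), and the explicit schema reflects that with --mode=lines each message is a single header-less line:

    import java.util.Map;

    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.databind.ObjectReader;
    import com.fasterxml.jackson.databind.ObjectWriter;
    import com.fasterxml.jackson.dataformat.csv.CsvMapper;
    import com.fasterxml.jackson.dataformat.csv.CsvSchema;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class CsvToMapConfiguration {

        @Bean("csvMapper")
        public ObjectReader csvMapper() {
            // Hypothetical, truncated column list: the real file has ~19 fields.
            CsvSchema schema = CsvSchema.builder()
                    .addColumn("id")
                    .addColumn("nom_voie")
                    .addColumn("code_postal")
                    .build();
            return new CsvMapper().readerFor(Map.class).with(schema);
        }

        @Bean("jsonWriter")
        public ObjectWriter jsonWriter() {
            return new ObjectMapper().writer();
        }
    }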

With this parent:

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

And this Spring Cloud version:

    <spring-cloud.version>Hoxton.RELEASE</spring-cloud.version>

What am I doing wrong that causes this issue?

There seems to be a problem with the value you set for --file.filename-pattern. Can you check whether you are actually passing a value that conforms to AntPathMatcher (the filename-pattern property is based on this path matcher)?

What happens if you try --file.filename-pattern=*.csv?

So, digging into the versions:

  • Combined with the spring-cloud Hoxton release train, the spring-cloud-stream version is 3.0.0.RELEASE:

    [INFO] +- org.springframework.cloud:spring-cloud-starter-stream-kafka:jar:3.0.0.RELEASE:compile
    [INFO] |  \- org.springframework.cloud:spring-cloud-stream-binder-kafka:jar:3.0.0.RELEASE:compile
    [INFO] |     +- org.springframework.cloud:spring-cloud-stream-binder-kafka-core:jar:3.0.0.RELEASE:compile
    [INFO] |     |  \- org.springframework.integration:spring-integration-kafka:jar:3.2.1.RELEASE:compile
    [INFO] |     \- org.springframework.kafka:spring-kafka:jar:2.3.3.RELEASE:compile
  • The log-sink app 2.1.2 uses spring-cloud-stream v2.1.4.RELEASE:

    [INFO] +- org.springframework.cloud:spring-cloud-starter-stream-kafka:jar:2.1.4.RELEASE:compile
    [INFO] |  \- org.springframework.cloud:spring-cloud-stream-binder-kafka:jar:2.1.4.RELEASE:compile
    [INFO] |     +- org.springframework.cloud:spring-cloud-stream-binder-kafka-core:jar:2.1.4.RELEASE:compile
    [INFO] |     |  \- org.springframework.integration:spring-integration-kafka:jar:3.1.0.RELEASE:compile
    [INFO] |     \- org.springframework.kafka:spring-kafka:jar:2.2.8.RELEASE:compile
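
Both trees above come from Maven's standard dependency report, filtered to the relevant groups, for instance:

    mvn dependency:tree -Dincludes=org.springframework.kafka,org.springframework.cloud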

As the spring-kafka 2.3.3 documentation for the DefaultKafkaHeaderMapper.setEncodeStrings method puts it:

Set to true if a consumer of the outbound record is using Spring for Apache Kafka version less than 2.3

The log-sink application does use spring-kafka v2.2.8, so I had to set this flag to true via a custom header mapper:

    @Bean("kafkaBinderHeaderMapper")
    public KafkaHeaderMapper kafkaBinderHeaderMapper() {
        DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
        mapper.setEncodeStrings(true);
        return mapper;
    }
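
To illustrate what this flag changes, here is a minimal sketch with plain Jackson (illustrative only, not the mapper's actual code path): with encodeStrings=true the String header value travels as valid JSON, which a consumer that always JSON-decodes headers (spring-kafka < 2.3) can parse back.

    import com.fasterxml.jackson.databind.ObjectMapper;

    public class EncodeStringsIllustration {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            // Quoted on the wire: "\"adresses-28.csv\"" instead of raw bytes,
            // so JSON-decoding it on the consumer side succeeds.
            byte[] encoded = mapper.writeValueAsBytes("adresses-28.csv");
            String decoded = mapper.readValue(encoded, String.class);
            System.out.println(decoded); // adresses-28.csv
        }
    }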

But if I do that, the log sink logs nothing at all, because it cannot understand the contentType header as encoded by DefaultKafkaHeaderMapper. The team provides a BinderHeaderMapper to address this issue:

Custom header mapper for Apache Kafka. This is identical to the DefaultKafkaHeaderMapper from spring Kafka. This is provided for addressing some interoperability issues between Spring Cloud Stream 3.0.x and 2.x apps, where mime types passed as regular MimeType in the header are not de-serialized properly

So I had to configure this custom BinderHeaderMapper in my application instead:

    @Bean("kafkaBinderHeaderMapper")
    public KafkaHeaderMapper kafkaBinderHeaderMapper() {
        BinderHeaderMapper mapper = new BinderHeaderMapper();
        mapper.setEncodeStrings(true);
        return mapper;
    }
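
Note that both beans are deliberately named kafkaBinderHeaderMapper: as far as I can tell, this is the bean name the Kafka binder looks up when it needs a header mapper (the binder also exposes a spring.cloud.stream.kafka.binder.headerMapperBeanName property for pointing it at a differently named bean). BinderHeaderMapper itself ships with spring-cloud-stream-binder-kafka, as the org.springframework.cloud.stream.binder.kafka.BinderHeaderMapper frame in the stack trace above shows.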

And with that, everything works fine.