available: expected at least 1 bean which qualifies as autowire candidate

I want to implement a Kafka producer that sends and receives Java serialized objects. I tried this:

Producer:

@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Send object:

@Autowired
private KafkaTemplate<String, SaleRequestFactory> kafkaTemplate;

private static String topic = "tp-sale";

private void perform(){

    Transaction transaction = new Transaction();
    transaction.setStatus(PaymentTransactionStatus.IN_PROGRESS.getText());

    Transaction insertedTransaction = transactionService.save(transaction);

    SaleRequestFactory obj = new SaleRequestFactory();
    obj.setId(100);

    ListenableFuture<SendResult<String, SaleRequestFactory>> send = kafkaTemplate.send(topic, obj);
}

Consumer:

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    private String groupId = "test";

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String>
    kafkaListenerContainerFactory() {

        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Receive object

    private static String topic = "tp-sale";

    @KafkaListener(topics = "tp-sale")
    public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf, @Headers MessageHeaders headers) throws Exception {

        System.out.println(tf.getId());

        SaleResponseFactory resObj = new SaleResponseFactory();
        resObj.setUnique_id("123123");

        return resObj;
    }

When I deploy the Producer I get this error:

Application failed to start due to an exception org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.kafka.core.KafkaTemplate<java.lang.String, org.engine.plugin.transactions.factory.SaleRequestFactory>' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}

Do you know how I can fix this issue?

EDIT: I managed to implement these improvements:

Producer:

@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, SaleRequestFactory> saleRequestFactoryProducerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, SaleRequestFactorySerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, SaleRequestFactory> saleRequestFactoryKafkaTemplate() {
        return new KafkaTemplate<>(saleRequestFactoryProducerFactory());
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Send object:

@Autowired
private KafkaTemplate<String, SaleRequestFactory> saleRequestFactoryKafkaTemplate;

private static String topic = "tp-sale";

private void perform(){

    SaleRequestFactory obj = new SaleRequestFactory();
    obj.setId(100);

    ListenableFuture<SendResult<String, SaleRequestFactory>> send = saleRequestFactoryKafkaTemplate.send(topic, obj);
}

Consumer:

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    private String groupId = "test";

    @Bean
    public ConsumerFactory<String, SaleResponseFactory> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, SaleResponseFactoryDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Receive object

    @KafkaListener(topics = "tp-sale")
    public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf, @Headers MessageHeaders headers) throws Exception {

        System.out.println(tf.getId());

        SaleResponseFactory resObj = new SaleResponseFactory();
        resObj.setUnique_id("123123");

        return resObj;
    }

Custom objects:

    @Getter
    @Setter
    @NoArgsConstructor
    @AllArgsConstructor
    @Builder(toBuilder = true)
    public class SaleRequestFactory implements Serializable{
    
        private static final long serialVersionUID = 1744050117179344127L;
        
        private int id;
    }

public class SaleRequestFactorySerializer implements Serializable, Serializer<SaleRequestFactory> {

    @Override
    public byte[] serialize(String topic, SaleRequestFactory data) {
        // Convert data to byte[] via standard Java serialization.
        // Closing the ObjectOutputStream (not just the ByteArrayOutputStream)
        // flushes its buffer before toByteArray() is called.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (ObjectOutputStream outputStream = new ObjectOutputStream(out))
        {
            outputStream.writeObject(data);
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }

        return out.toByteArray();
    }
}
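
For reference, the listener above receives a @Payload SaleRequestFactory, so the consumer that reads it would need a matching deserializer, which is not shown here. A minimal sketch under the same Java-serialization scheme; the class name SaleRequestFactoryDeserializer is an assumption for illustration:

// Hypothetical counterpart to SaleRequestFactorySerializer; any
// Deserializer<SaleRequestFactory> wired into the consumer factory
// for the "tp-sale" listener would do.
public class SaleRequestFactoryDeserializer implements Serializable, Deserializer<SaleRequestFactory> {

    @Override
    public SaleRequestFactory deserialize(String topic, byte[] data) {
        // Reverse of the serializer above: read the object back via Java serialization
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (SaleRequestFactory) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            e.printStackTrace();
            return null;
        }
    }
}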


    @Getter
    @Setter
    @NoArgsConstructor
    @AllArgsConstructor
    @Builder(toBuilder = true)
    public class SaleResponseFactory implements Serializable{
    
        private static final long serialVersionUID = 1744050117179344127L;
    
        private String unique_id;
    }

public class SaleResponseFactoryDeserializer implements Serializable, Deserializer<SaleResponseFactory> {

    @Override
    public SaleResponseFactory deserialize(String topic, byte[] data) {
        // convert data to SaleResponseFactory
        SaleResponseFactory saleResponseFactory = null;
        try
        {
            ByteArrayInputStream bis = new ByteArrayInputStream(data);
            ObjectInputStream in = new ObjectInputStream(bis);
            saleResponseFactory = (SaleResponseFactory) in.readObject();
            in.close();
        }
        catch (IOException | ClassNotFoundException e)
        {
            e.printStackTrace();
        }
        return saleResponseFactory;
    }
}

When I send a message I get this error:

13:03:53.675 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] DEBUG RecordMessagingMessageListenerAdapter[debug:296] - Listener method returned result [org.factory.SaleResponseFactory@69c400ab] - generating response message for it
13:03:53.675 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] DEBUG RecordMessagingMessageListenerAdapter[debug:296] - No replyTopic to handle the reply: org.factory.SaleResponseFactory@69c400ab

Do you know how I can fix this issue?

Your only KafkaTemplate bean is

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

Try declaring another bean with the required generics:

@Bean
public KafkaTemplate<String, SaleRequestFactory> saleRequestFactoryKafkaTemplate() {
    return new KafkaTemplate<>(saleRequestFactoryProducerFactory());
}

You also need a corresponding ProducerFactory:

@Bean
public ProducerFactory<String, SaleRequestFactory> saleRequestFactoryProducerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);

    // Serialization configuration
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    return new DefaultKafkaProducerFactory<>(configProps);
}

Answer to the updated part: I think the debug message is misleading, because you did not choose to forward the result to another topic. More details here. Everything appears to be working as expected. I think there should be a way to skip the entire forwarding path, including that logging, when the listener return type is not Message and there is no @SendTo annotation, which is required for forwarding. I've dropped a comment to a Spring Kafka committer to see what he thinks.
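
If you did want the listener's return value forwarded, the container factory needs a reply template and the listener needs an @SendTo destination. A minimal sketch; the reply topic name "tp-sale-reply" and the saleResponseFactoryKafkaTemplate bean are assumptions for illustration, not part of your code above:

// Sketch only: give the listener container a template that can serialize
// SaleResponseFactory, then name a reply topic with @SendTo.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> kafkaListenerContainerFactory(
        KafkaTemplate<String, SaleResponseFactory> saleResponseFactoryKafkaTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    // The reply template is what actually publishes the listener's return value
    factory.setReplyTemplate(saleResponseFactoryKafkaTemplate);
    return factory;
}

@KafkaListener(topics = "tp-sale")
@SendTo("tp-sale-reply") // without @SendTo (or a Message<?> return type) the result is simply dropped
public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf) {
    SaleResponseFactory resObj = new SaleResponseFactory();
    resObj.setUnique_id("123123");
    return resObj;
}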