Question:

@KafkaListener not consuming messages - deserialization problem

郏佐
2023-03-14
@Component
public static class PageViewEventSource implements ApplicationRunner {

    private final MessageChannel pageViewsOut;
    private final Log log = LogFactory.getLog(getClass());

    public PageViewEventSource(AnalyticsBinding binding) {
        this.pageViewsOut = binding.pageViewsOut();
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
        List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
        Runnable runnable = () -> {
            String rPage = pages.get(new Random().nextInt(pages.size()));
            String rName = names.get(new Random().nextInt(names.size()));
            PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);

            // serialize the event to JSON bytes before publishing
            Serializer<PageViewEvent> serializer = new JsonSerde<>(PageViewEvent.class).serializer();
            byte[] m = serializer.serialize(null, pageViewEvent);

            Message<byte[]> message = MessageBuilder
                    .withPayload(m).build();

            try {
                this.pageViewsOut.send(message);
                log.info("sent " + message);
            } catch (Exception e) {
                log.error(e);
            }
        };
        Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
    }
}
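The AnalyticsBinding interface is not shown in the question. Presumably it is a Spring Cloud Stream output binding along the following lines; the channel name "pvout" and its mapping to the "test1" topic through application properties are assumptions for illustration only.

import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;

// Hypothetical sketch of the binding used by PageViewEventSource; the real channel name
// and its destination (e.g. spring.cloud.stream.bindings.pvout.destination=test1,
// enabled via @EnableBinding(AnalyticsBinding.class)) are not given in the question.
public interface AnalyticsBinding {

    String PAGE_VIEWS_OUT = "pvout";

    @Output(PAGE_VIEWS_OUT)
    MessageChannel pageViewsOut();
}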

I am trying to consume these messages in a separate consumer application via a Spring @KafkaListener:

@Service
public class PriceEventConsumer {

    private static final Logger LOG = LoggerFactory.getLogger(PriceEventConsumer.class);

    @KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
    public void receive(Bytes data) {
    //public void receive(@Payload PageViewEvent data, @Headers MessageHeaders headers) {
        LOG.info("Message received");
        LOG.info("received data='{}'", data);
    }
}

Container factory configuration:

@Bean
public ConsumerFactory<String, Bytes> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, BytesDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}

With this configuration, the consumer does not receive the messages (Bytes). If I change the Kafka listener to accept a String, I get the following exception:

@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(String data) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);
}

@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(@Payload PageViewEvent data, @Headers MessageHeaders headers) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);
}

Container factory configuration:

@Bean
public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> priceEventsKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(priceEventConsumerFactory());
    return factory;
}

Producer:

@Override
public void run(ApplicationArguments args) throws Exception {
    List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
    List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
    Runnable runnable = () -> {
        String rPage = pages.get(new Random().nextInt(pages.size()));
        String rName = names.get(new Random().nextInt(names.size()));
        PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);

        Message<PageViewEvent> message = MessageBuilder
                .withPayload(pageViewEvent).build();
        try {
            this.pageViewsOut.send(message);
            log.info("sent " + message);
        } catch (Exception e) {
            log.error(e);
        }
    };
    Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
}

1 Answer

李兴安
2023-03-14

For versions earlier than 2.2.x, a MessageConverter can be used to deserialize the record from Kafka into a POJO (see the container factory at the end of this answer).

Starting with version 2.2, you can configure the deserializer explicitly to use the supplied target type and ignore the type information in the headers, by using one of the overloaded constructors that takes a boolean:

@Bean
public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    // 'false' -> use the supplied target type and ignore any type headers on the record
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class, false));
}
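For this factory to take effect, the @KafkaListener must reference the container factory bean that wraps it; the listeners in the question point at "kafkaListenerContainerFactory", which is the Bytes-based one. A minimal sketch, assuming the question's priceEventsKafkaListenerContainerFactory bean and its PriceEventConsumer class (which already defines LOG):

// Sketch only: reference the container factory that carries the JsonDeserializer.
@KafkaListener(topics = "test1", groupId = "json",
        containerFactory = "priceEventsKafkaListenerContainerFactory")
public void receive(PageViewEvent data) {
    LOG.info("received data='{}'", data);
}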

Or use a MessageConverter:

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    // the converter maps the JSON payload onto the listener method's parameter type
    factory.setMessageConverter(new StringJsonMessageConverter());
    return factory;
}
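With the converter in place, the records are still consumed as raw bytes, and the converter maps the JSON payload onto whatever parameter type the listener method declares. A minimal usage sketch, assuming the same topic, group and PriceEventConsumer class as in the question:

// Sketch only: the message converter turns the JSON payload into a PageViewEvent here.
@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(PageViewEvent data) {
    LOG.info("received data='{}'", data);
}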