Question:

KafkaAvroDeserializer fails to deserialize Kafka messages into a specific Avro record

沈英勋
2023-03-14

I am trying to deserialize Avro messages from Kafka into POJOs generated from the Avro schema, using KafkaAvroDeserializer for the conversion.

I can see the GenericRecord in the ConsumerRecord.

Setup: Avro 1.9.1, Confluent 5.4, commercehub gradle-avro-plugin 0.20.0

When attempting to deserialize an Avro message, I get the following error:

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 66
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.time.LocalDate
    at com.sample.Data.put(Data.java:229) ~[main/:na]
    at org.apache.avro.generic.GenericData.setField(GenericData.java:795) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:139) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:237) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:170) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:136) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:237) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:170) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151) ~[avro-1.9.1.jar:1.9.1]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:144) ~[avro-1.9.1.jar:1.9.1]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:287) ~[kafka-avro-serializer-5.4.0.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:102) ~[kafka-avro-serializer-5.4.0.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:81) ~[kafka-avro-serializer-5.4.0.jar:na]
    at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55) ~[kafka-avro-serializer-5.4.0.jar:na]
    at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1310) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$3500(Fetcher.java:128) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1541) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1700(Fetcher.java:1377) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:677) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:632) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1290) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1248) ~[kafka-clients-2.5.0.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1216) ~[kafka-clients-2.5.0.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1091) [spring-kafka-2.5.2.RELEASE.jar:2.5.2.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1047) [spring-kafka-2.5.2.RELEASE.jar:2.5.2.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:972) [spring-kafka-2.5.2.RELEASE.jar:2.5.2.RELEASE]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_241]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_241]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_241]

Avro schema snippet for the field that triggers the ClassCastException:

{
    "name": "BIRTH_DT",
    "type": [
        "null",
        {
            "type": "int",
            "logicalType": "date"
        }
    ],
    "default": null
}
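The schema above explains where the Integer comes from: Avro's "date" logical type is encoded on the wire as a plain int counting days since the Unix epoch, and a registered conversion is what turns it into a LocalDate. A minimal pure-JDK sketch of that decoding step (the class and method names here are illustrative, not from Avro itself):

```java
import java.time.LocalDate;

public class DateLogicalTypeDemo {
    // Avro's "date" logical type is serialized as an int: days since
    // 1970-01-01. Avro 1.9+'s DateConversion does essentially this.
    public static LocalDate decodeAvroDate(int daysSinceEpoch) {
        return LocalDate.ofEpochDay(daysSinceEpoch);
    }

    public static void main(String[] args) {
        // 18000 days after the epoch is 2019-04-14.
        System.out.println(decodeAvroDate(18000));
    }
}
```

If the reader has no such conversion registered for the schema, the raw Integer is handed straight to the generated put() method, which is exactly the cast failure in the stack trace.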

Snippet from the generated POJO:

  @Deprecated public java.time.LocalDate BIRTH_DT;

  // Used by DatumReader.  Applications should not call.
  @SuppressWarnings(value="unchecked")
  public void put(int field$, java.lang.Object value$) {
    switch (field$) {
    .
    .
    case 8: BIRTH_DT = (java.time.LocalDate)value$; break;
    default: throw new org.apache.avro.AvroRuntimeException("Bad index");
    }
  }

  public java.time.LocalDate getBIRTHDT() {
    return BIRTH_DT;
  }

  public void setBIRTHDT(java.time.LocalDate value) {
      this.BIRTH_DT = value;
  }
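The failure mode in the generated put() can be reproduced in isolation: it performs an unchecked cast, so when the decoder hasn't applied the date conversion and passes the raw int, the cast blows up. A small standalone sketch (class name hypothetical, mimicking case 8 above):

```java
import java.time.LocalDate;

public class PutCastDemo {
    // Mimics the generated put(): an unchecked cast of the decoded value.
    public static LocalDate putBirthDt(Object value) {
        return (LocalDate) value;
    }

    public static void main(String[] args) {
        try {
            // A reader without the date conversion passes the raw Integer,
            // reproducing "java.lang.Integer cannot be cast to
            // java.time.LocalDate" from the stack trace.
            putBirthDt(Integer.valueOf(18000));
        } catch (ClassCastException e) {
            System.out.println("caught: " + e);
        }
    }
}
```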

Kafka consumer method:

    @KafkaListener(topics = "${spring.kafka.consumer.properties.topic}",
                     groupId = "${spring.kafka.consumer.group-id}")
    // Data is a POJO generated by Avro tools
    public void consume(ConsumerRecord<String, Data> record,
                        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) Integer partition,
                        @Header(KafkaHeaders.OFFSET) Long offset, Acknowledgment ack) throws IOException {
    
        logger.info(String.format("#### -> Consumed message -> partition: %s, offset: %s", partition, offset));
        Data row = record.value();
        ack.acknowledge();
    }
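For a listener typed against the generated Data class to receive specific records at all, the deserializer must be told to use the specific reader; otherwise it returns GenericRecord. A sketch of the relevant consumer properties, with placeholder broker and registry URLs (adjust to your environment; the question's own config is not shown):

```java
import java.util.Properties;

public class AvroConsumerProps {
    public static Properties build() {
        Properties props = new Properties();
        // Placeholder addresses -- substitute your own brokers and registry.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("schema.registry.url", "http://localhost:8081");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        // Without this flag KafkaAvroDeserializer yields GenericRecord
        // instead of the generated specific class.
        props.put("specific.avro.reader", "true");
        return props;
    }
}
```

Note that specific.avro.reader=true is what routes deserialization through SpecificDatumReader, the code path visible in the stack trace above.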

build.gradle

buildscript {
    repositories {
        jcenter {
            url "https://nexus.abc.com:8443/content/repositories/jcenter/"
        }
    }
    dependencies {
        classpath "com.commercehub.gradle.plugin:gradle-avro-plugin:0.20.0"
    }
}

plugins {
    id 'org.springframework.boot' version '2.3.1.RELEASE'
    id 'io.spring.dependency-management' version '1.0.9.RELEASE'
    id 'java'
    id 'idea'
    id 'eclipse'
}

repositories {
    maven { url nexusPublicRepoURL }
    maven { url "https://nexus.abc.com:8443/content/repositories/confluence.io-maven/" }
    jcenter()
    maven { url "https://nexus.abc.com:8443/content/repositories/jcenter/" }
}

group = 'com.abc.cscm'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '8'
targetCompatibility = '8'

ext {
    springCloudVersion = 'Hoxton.SR6'
    confluentVersion = '5.4.0'
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-actuator'
    implementation 'org.springframework.boot:spring-boot-starter-web'
    implementation 'org.springframework.kafka:spring-kafka'

    implementation "io.confluent:kafka-avro-serializer:${confluentVersion}"

    implementation 'org.apache.avro:avro:1.9.1'

    testImplementation('org.springframework.boot:spring-boot-starter-test') {
        exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
    }
    testImplementation 'org.springframework.kafka:spring-kafka-test'
}

springBoot {
    buildInfo()
}

dependencyManagement {
    imports {
        mavenBom "org.springframework.cloud:spring-cloud-dependencies:${springCloudVersion}"
    }
}

test {
    useJUnitPlatform()
}

wrapper {
    distributionUrl = "https://nexus.abc.com:8443/service/local/repositories/thirdparty/content/org/gradle/gradle/6.5/gradle-6.5.zip"
}

apply plugin: "com.commercehub.gradle.plugin.avro"
apply plugin: 'idea'

./gradlew dependencies --configuration compileClasspath (output)

> Task :dependencies

------------------------------------------------------------
Root project
------------------------------------------------------------

compileClasspath - Compile classpath for source set 'main'.
                        ** omitting spring deps
+--- io.confluent:kafka-avro-serializer:5.4.0
|    +--- org.apache.avro:avro:1.9.1
|    |    +--- com.fasterxml.jackson.core:jackson-core:2.9.9 -> 2.11.0
|    |    +--- com.fasterxml.jackson.core:jackson-databind:2.9.9.3 -> 2.11.0 (*)
|    |    +--- org.apache.commons:commons-compress:1.19
|    |    \--- org.slf4j:slf4j-api:1.7.25 -> 1.7.30
|    +--- io.confluent:kafka-schema-registry-client:5.4.0
|    |    +--- org.apache.kafka:kafka-clients:5.4.0-ccs -> 2.5.0 (*)
|    |    +--- io.confluent:common-config:5.4.0
|    |    |    +--- io.confluent:common-utils:5.4.0
|    |    |    |    \--- org.slf4j:slf4j-api:1.7.26 -> 1.7.30
|    |    |    \--- org.slf4j:slf4j-api:1.7.26 -> 1.7.30
|    |    +--- org.apache.avro:avro:1.9.1 (*)
|    |    +--- com.fasterxml.jackson.core:jackson-databind:2.9.10.1 -> 2.11.0 (*)
|    |    +--- io.swagger:swagger-annotations:1.5.22
|    |    +--- io.swagger:swagger-core:1.5.3
|    |    |    +--- org.apache.commons:commons-lang3:3.2.1 -> 3.10
|    |    |    +--- org.slf4j:slf4j-api:1.6.3 -> 1.7.30
|    |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.4.5 -> 2.11.0
|    |    |    +--- com.fasterxml.jackson.core:jackson-databind:2.4.5 -> 2.11.0 (*)
|    |    |    +--- com.fasterxml.jackson.datatype:jackson-datatype-joda:2.4.5 -> 2.11.0
|    |    |    |    +--- com.fasterxml.jackson.core:jackson-core:2.11.0
|    |    |    |    \--- joda-time:joda-time:2.9.9
|    |    |    +--- com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.4.5 -> 2.11.0
|    |    |    |    +--- com.fasterxml.jackson.core:jackson-databind:2.11.0 (*)
|    |    |    |    +--- org.yaml:snakeyaml:1.26
|    |    |    |    \--- com.fasterxml.jackson.core:jackson-core:2.11.0
|    |    |    +--- io.swagger:swagger-models:1.5.3
|    |    |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.4.5 -> 2.11.0
|    |    |    |    +--- org.slf4j:slf4j-api:1.6.3 -> 1.7.30
|    |    |    |    \--- io.swagger:swagger-annotations:1.5.3 -> 1.5.22
|    |    |    \--- com.google.guava:guava:18.0 -> 29.0-android
|    |    |         +--- com.google.guava:failureaccess:1.0.1
|    |    |         +--- com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
|    |    |         +--- com.google.code.findbugs:jsr305:3.0.2
|    |    |         +--- org.checkerframework:checker-compat-qual:2.5.5
|    |    |         +--- com.google.errorprone:error_prone_annotations:2.3.4
|    |    |         \--- com.google.j2objc:j2objc-annotations:1.3
|    |    \--- io.confluent:common-utils:5.4.0 (*)
|    +--- io.confluent:common-config:5.4.0 (*)
|    \--- io.confluent:common-utils:5.4.0 (*)
\--- org.apache.avro:avro:1.9.1 (*)

./gradlew buildEnvironment (output)

classpath
+--- com.commercehub.gradle.plugin:gradle-avro-plugin:0.20.0
|    \--- org.apache.avro:avro-compiler:1.9.2    <<<<<<<<<<<<<<<<<<<<<<<<<<
|         +--- org.apache.avro:avro:1.9.2
|         |    +--- com.fasterxml.jackson.core:jackson-core:2.10.2 -> 2.11.0
|         |    +--- com.fasterxml.jackson.core:jackson-databind:2.10.2 -> 2.11.0
|         |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.11.0
|         |    |    \--- com.fasterxml.jackson.core:jackson-core:2.11.0
|         |    +--- org.apache.commons:commons-compress:1.19
|         |    \--- org.slf4j:slf4j-api:1.7.25 -> 1.7.30
|         +--- org.apache.commons:commons-lang3:3.9 -> 3.10
|         +--- org.apache.velocity:velocity-engine-core:2.2
|         |    +--- org.apache.commons:commons-lang3:3.9 -> 3.10
|         |    \--- org.slf4j:slf4j-api:1.7.30
|         +--- com.fasterxml.jackson.core:jackson-databind:2.10.2 -> 2.11.0 (*)
|         +--- joda-time:joda-time:2.10.1
|         \--- org.slf4j:slf4j-api:1.7.25 -> 1.7.30

I am not sure whether I should edit the generated POJO class or whether I am missing something.

I was able to convert the Avro message into a POJO by changing the schema as described in the question below, but that feels like a workaround and the underlying problem remains unresolved.

Question - Avro fails to deserialize a union with a logical type in the field


1 Answer

樊浩初
2023-03-14

Could you clarify which version of avro-maven-plugin you used to generate the POJOs? As of Avro 1.9.0, Joda-Time is deprecated in favor of Java 8 JSR-310 types, which are now the default. See the Apache Avro 1.9.0 release notes.

When I generated the POJO from scratch, I got java.time.LocalDate BIRTH_DT rather than org.joda.time.LocalDate BIRTH_DT:

   @Deprecated public java.time.LocalDate BIRTH_DT;

So in your case, I think there is most likely an Avro version mismatch on the classpath, or the POJOs are outdated. I would suggest verifying the Avro version with mvn dependency:tree -Dincludes=org.apache.avro:avro and regenerating the POJOs.
