Question:

Flink job jar missing class CheckpointCommitter - flink-connector-cassandra - hard error

汪深
2023-03-14

I'm hitting this error when submitting a new job to Flink (v1.0.3) in my local environment.

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/streaming/runtime/operators/CheckpointCommitter
    at org.apache.flink.streaming.connectors.cassandra.CassandraSink.addSink(CassandraSink.java:164)
    at com.xxx.yyy.sample.backend.flink.AAAAA.main(AAAAA.java:99)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
    ... 33 more

See the source on GitHub:

https://github.com/apache/flink/tree/master/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/operators

I think the class should be available; it should be packaged in flink-streaming-java.jar. But once the jar is deployed, the class doesn't exist:

CheckpointCommitter is missing.

Any ideas?

pom.xml


<groupId>com.AAAAA.BBB</groupId>
<artifactId>sample-backend-flink</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>

<name>Flink Quickstart Job</name>
<url>http://www.myorganization.org</url>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.0.3</flink.version>
    <cassandra.version>2.2.7</cassandra.version>
    <driver.version>3.0.0</driver.version>
</properties>

<repositories>
    <repository>
        <id>apache.snapshots</id>
        <name>Apache Development Snapshot Repository</name>
        <url>https://repository.apache.org/content/repositories/snapshots/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>


<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.10</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.10</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
        <version>1.0.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-cassandra_2.10</artifactId>
        <version>1.0.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-redis_2.10</artifactId>
        <version>1.1-SNAPSHOT</version>
    </dependency>
</dependencies>

<profiles>
    <profile>
        <!-- Profile for packaging correct JAR files -->
        <id>build-jar</id>
        <activation>
            <activeByDefault>false</activeByDefault>
        </activation>
        <dependencies>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-java</artifactId>
                <version>${flink.version}</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-streaming-java_2.10</artifactId>
                <version>${flink.version}</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-streaming-scala_2.10</artifactId>
                <version>${flink.version}</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-clients_2.10</artifactId>
                <version>${flink.version}</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-core</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
                <version>1.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-cassandra_2.10</artifactId>
                <version>1.1-SNAPSHOT</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-redis_2.10</artifactId>
                <version>1.1-SNAPSHOT</version>
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <!-- disable the exclusion rules -->
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <version>2.4.1</version>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                            <configuration>
                                <artifactSet>
                                    <excludes combine.self="override"></excludes>
                                </artifactSet>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
                <!-- Run shade goal on package phase -->
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.apache.flink:flink-annotations</exclude>
                                <exclude>org.apache.flink:flink-shaded-hadoop1</exclude>
                                <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
                                <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
                                <exclude>org.apache.flink:flink-core</exclude>
                                <exclude>org.apache.flink:flink-java</exclude>
                                <exclude>org.apache.flink:flink-scala_2.10</exclude>
                                <exclude>org.apache.flink:flink-runtime_2.10</exclude>
                                <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
                                <exclude>org.apache.flink:flink-clients_2.10</exclude>
                                <exclude>org.apache.flink:flink-avro_2.10</exclude>
                                <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
                                <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
                                <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>


                                <exclude>org.scala-lang:scala-library</exclude>
                                <exclude>org.scala-lang:scala-compiler</exclude>
                                <exclude>org.scala-lang:scala-reflect</exclude>
                                <exclude>com.amazonaws:aws-java-sdk</exclude>
                                <exclude>com.typesafe.akka:akka-actor_*</exclude>
                                <exclude>com.typesafe.akka:akka-remote_*</exclude>
                                <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
                                <exclude>io.netty:netty-all</exclude>
                                <exclude>io.netty:netty</exclude>
                                <exclude>commons-fileupload:commons-fileupload</exclude>
                                <exclude>org.apache.avro:avro</exclude>
                                <exclude>commons-collections:commons-collections</exclude>
                                <exclude>org.codehaus.jackson:jackson-core-asl</exclude>
                                <exclude>org.codehaus.jackson:jackson-mapper-asl</exclude>
                                <exclude>com.thoughtworks.paranamer:paranamer</exclude>
                                <exclude>org.xerial.snappy:snappy-java</exclude>
                                <exclude>org.apache.commons:commons-compress</exclude>
                                <exclude>org.tukaani:xz</exclude>
                                <exclude>com.esotericsoftware.kryo:kryo</exclude>
                                <exclude>com.esotericsoftware.minlog:minlog</exclude>
                                <exclude>org.objenesis:objenesis</exclude>
                                <exclude>com.twitter:chill_*</exclude>
                                <exclude>com.twitter:chill-java</exclude>
                                <exclude>com.twitter:chill-avro_*</exclude>
                                <exclude>com.twitter:chill-bijection_*</exclude>
                                <exclude>com.twitter:bijection-core_*</exclude>
                                <exclude>com.twitter:bijection-avro_*</exclude>
                                <exclude>commons-lang:commons-lang</exclude>
                                <exclude>junit:junit</exclude>
                                <exclude>de.javakaffee:kryo-serializers</exclude>
                                <exclude>joda-time:joda-time</exclude>
                                <exclude>org.apache.commons:commons-lang3</exclude>
                                <exclude>org.slf4j:slf4j-api</exclude>
                                <exclude>org.slf4j:slf4j-log4j12</exclude>
                                <exclude>log4j:log4j</exclude>
                                <exclude>org.apache.commons:commons-math</exclude>
                                <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
                                <exclude>commons-logging:commons-logging</exclude>
                                <exclude>commons-codec:commons-codec</exclude>
                                <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
                                <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
                                <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
                                <exclude>stax:stax-api</exclude>
                                <exclude>com.typesafe:config</exclude>
                                <exclude>org.uncommons.maths:uncommons-maths</exclude>
                                <exclude>com.github.scopt:scopt_*</exclude>
                                <exclude>commons-io:commons-io</exclude>
                                <exclude>commons-cli:commons-cli</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>org.apache.flink:*</artifact>
                                <excludes>
                                    <!-- exclude shaded google but include shaded curator -->
                                    <exclude>org/apache/flink/shaded/com/**</exclude>
                                    <exclude>web-docs/**</exclude>
                                </excludes>
                            </filter>
                            <filter>
                                <!-- Do not copy the signatures in the META-INF folder.
                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <!-- add Main-Class to manifest file -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>com.demandware.unified.sample.backend.flink.PhysicalInventory</mainClass>
                            </transformer>
                        </transformers>
                        <createDependencyReducedPom>false</createDependencyReducedPom>
                    </configuration>
                </execution>
            </executions>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
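
Note that the pom above mixes Flink release lines: the core artifacts use 1.0.3, the Kafka and Cassandra connectors use 1.0.2, and the build-jar profile pulls the Cassandra connector at 1.1-SNAPSHOT. A minimal sketch of keeping every Flink artifact on one line, driven by the single `flink.version` property (the exact version to use is an assumption; it must match the Flink version installed on the cluster):

```xml
<!-- Sketch: derive every Flink artifact's version from one property so the
     client, runtime, and connectors cannot drift apart. Pick the property
     value to match the cluster's Flink version. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.10</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-cassandra_2.10</artifactId>
    <version>${flink.version}</version>
</dependency>
```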

1 Answer

贺兴昌
2023-03-14

I had a similar problem. What fixed it for me was making sure the Kafka consumer/producer classes used in the source matched the connector artifact I depended on.

There seemed to be a conflict between different versions of FlinkKafkaConsumer.

When using this dependency:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
    <version>1.0.2</version>
</dependency>

make sure you use FlinkKafkaConsumer08, not FlinkKafkaConsumer (and likewise FlinkKafkaProducer08 rather than FlinkKafkaProducer if you use a producer).

The odd part for me was that my IDE could not resolve FlinkKafkaConsumer08, yet it compiled with the build tool and ran correctly.
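
The same version-conflict reasoning may explain the original CheckpointCommitter error. The class `org/apache/flink/streaming/runtime/operators/CheckpointCommitter` is part of flink-streaming-java, so a Cassandra connector built against a newer release line (the pom's build-jar profile uses 1.1-SNAPSHOT) will reference a class that a 1.0.3 runtime does not ship. A sketch of the profile with the connector and the provided core pinned to the same line (versions here are an assumption, to be matched to the cluster):

```xml
<!-- Sketch for the build-jar profile: the connector must come from the same
     release line as the provided core artifacts, otherwise the connector can
     reference runtime classes (such as CheckpointCommitter) that the cluster
     does not contain. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.10</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-cassandra_2.10</artifactId>
    <version>${flink.version}</version>
</dependency>
```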
