Question:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool

慕云
2023-03-14
I get the error below when I package (jar) and run my defaulthadoopjob:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Tool
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 12 more
Could not find the main class: DefaultHadoopJobDriver. Program will exit.


Commands used to build the jar:

# jar -cvf dhj.jar 
# hadoop -jar dhj.jar DefaultHadoopJobDriver

The above command gave me the error "Failed to load Main-Class manifest attribute from dhj.jar".

I rebuilt the jar with a manifest using the command below:

jar -cvfe dhj.jar DefaultHadoopJobDriver
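For what it's worth, with the jar tool's -cvfe form the jar file name comes first, the entry-point class second, and the class files to package must be listed after them, otherwise the archive ends up empty. A hedged sketch (class and jar names taken from the question, paths illustrative):

```shell
# Not runnable without a JDK and a Hadoop install; shown for shape only:
#   javac -cp "$HADOOP_HOME/hadoop-core-1.2.1.jar" DefaultHadoopJobDriver.java
#   jar -cvfe dhj.jar DefaultHadoopJobDriver DefaultHadoopJobDriver*.class

# The 'e' option writes a Main-Class entry into META-INF/MANIFEST.MF,
# which is exactly what the "Failed to load Main-Class manifest attribute"
# error was complaining about:
printf 'Manifest-Version: 1.0\nMain-Class: DefaultHadoopJobDriver\n'

# After building, the manifest can be inspected with:
#   unzip -p dhj.jar META-INF/MANIFEST.MF
```

Note that this only addresses the manifest error; the NoClassDefFoundError itself means the Hadoop jars were missing from the classpath at run time.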

My Hadoop job has only one class, "DefaultHoopJobDrive", which extends Configured and implements Tool, and its run method is the only code that creates the Job and sets the input and output paths. I'm also using the new API.

I'm running Hadoop 1.2.1, and the job works fine from Eclipse.

This might have something to do with the classpath. Please help.

3 Answers

裴昊阳
2023-03-14

If anyone using Maven lands here: the dependency problem can be solved by asking Maven to include any jars it needs inside the parent project's jar itself. That way, Hadoop doesn't have to look elsewhere for dependencies; it can find them all right there. Here is how to do it:

1. Go to pom.xml

2. Add a <build> section (with a <plugins> section inside it) to your pom, if it doesn't already have one

3. Add the following plugin to the <plugins> section:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>1.7.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.slf4j:slf4j-api</exclude>
                                <exclude>junit:junit</exclude>
                                <exclude>jmock:jmock</exclude>
                                <exclude>xml-apis:xml-apis</exclude>
                                <exclude>org.testng:testng</exclude>
                                <exclude>org.mortbay.jetty:jetty</exclude>
                                <exclude>org.mortbay.jetty:jetty-util</exclude>
                                <exclude>org.mortbay.jetty:servlet-api-2.5</exclude>
                                <exclude>tomcat:jasper-runtime</exclude>
                                <exclude>tomcat:jasper-compiler</exclude>
                                <exclude>org.apache.hadoop:hadoop-core</exclude>
                                <exclude>org.apache.mahout:mahout-math</exclude>
                                <exclude>commons-logging:commons-logging</exclude>
                                <exclude>org.mortbay.jetty:jsp-api-2.1</exclude>
                                <exclude>org.mortbay.jetty:jsp-2.1</exclude>
                                <exclude>org.eclipse.jdt:core</exclude>
                                <exclude>ant:ant</exclude>
                                <exclude>org.apache.hadoop:avro</exclude>
                                <exclude>jline:jline</exclude>
                                <exclude>log4j:log4j</exclude>
                                <exclude>org.yaml:snakeyaml</exclude>
                                <exclude>javax.ws.rs:jsr311-api</exclude>
                                <exclude>org.slf4j:jcl-over-slf4j</exclude>
                                <exclude>javax.servlet:servlet-api</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/jruby.home</exclude>
                                    <exclude>META-INF/license</exclude>
                                    <exclude>META-INF/maven</exclude>
                                    <exclude>META-INF/services</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    

Now build your project again and run it with the plain hadoop jar my.jar command. It shouldn't cry about dependencies anymore. Hope this helps!

常自怡
2023-03-14

Try building your Hadoop Java code with all of the Hadoop jars available in Hadoop's lib folder. In this case you are missing the Hadoop util classes, which are in hadoop-core-*.jar.

You can specify the classpath while building the code into the jar, or externalize it using the following command:

        hadoop -cp <path_containing_hadoop_jars> -jar <jar_name>
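As an aside, not every Hadoop 1.x launcher accepts a -cp flag; to my knowledge the more common convention is the HADOOP_CLASSPATH environment variable, which the hadoop script prepends to its own classpath. A sketch, with placeholder paths:

```shell
# HADOOP_HOME is a placeholder for your actual install location.
HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CLASSPATH="$HADOOP_HOME/hadoop-core-1.2.1.jar:$HADOOP_CLASSPATH"
echo "$HADOOP_CLASSPATH"

# Then launch with the standard form (needs a real Hadoop install):
#   hadoop jar dhj.jar DefaultHadoopJobDriver <input> <output>
```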
    
杜焕
2023-03-14

To execute this jar, you don't have to give hadoop -jar. The command is like this:

     hadoop jar <jar> [mainClass] args...
    

If this jar again gets a java.lang.ClassNotFoundException, then you can use the

    hadoop classpath

command to see whether hadoop-core-1.2.1.jar is present in your Hadoop installation's classpath.

FYI: if it is not in that list, you have to add this jar to the Hadoop lib directory.
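Since hadoop classpath prints one long ':'-separated string, grepping for the jar is easier after splitting it on the separator. A small sketch, run here against a sample string because the real output depends on the installation:

```shell
# In real use, replace the sample with:  CP="$(hadoop classpath)"
CP='/usr/lib/hadoop/conf:/usr/lib/hadoop/hadoop-core-1.2.1.jar:/usr/lib/hadoop/lib/commons-logging-1.1.1.jar'

# Split on ':' and look for the hadoop-core jar:
printf '%s\n' "$CP" | tr ':' '\n' | grep 'hadoop-core'
# -> /usr/lib/hadoop/hadoop-core-1.2.1.jar
```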
