Question:

Errors in the com.google.common package when building Spark 1.2 with Maven

章涵容
2023-03-14

CentOS 6.2
Hadoop 2.6.0
Scala 2.10.5
java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
mvn version:
Apache Maven 3.3.1 (cab6659f9874fa96462afef…; 2015-03-13)
Maven home: /opt/maven
Java version: 1.7.0_75, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
Environment variables

export SCALA_HOME=/opt/scala
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64
export JRE_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
export HADOOP_HOME=/home/tom/hadoop
export SPARK_HOME=/home/tom/spark
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$MAVEN_HOME/bin:$SCALA_HOME/bin
export MAVEN_HOME=/opt/maven

export SPARK_EXAMPLES_JAR=$SPARK_HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/"

Build command

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
Error message

[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22: object Throwables is not a member of package com.google.common.base
[ERROR] import com.google.common.base.Throwables
[ERROR]        ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59: not found: value Throwables
[ERROR]           Throwables.getRootCause(e) match {
[ERROR]           ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26: object util is not a member of package com.google.common
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder
[ERROR]                          ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69: not found: type ThreadFactoryBuilder
[ERROR]     Executors.newCachedThreadPool(new ThreadFactoryBuilder().setDaemon(true).
[ERROR]                                       ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76: not found: type ThreadFactoryBuilder
[ERROR]     new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume Receiver Thread - %d").build())
[ERROR]         ^
[ERROR] 5 errors found


[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 10.121 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 14.957 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.858 s]
[INFO] Spark Project Core ................................. SUCCESS [07:33 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 52.312 s]
[INFO] Spark Project GraphX ............................... SUCCESS [02:19 min]
[INFO] Spark Project Streaming ............................ SUCCESS [03:28 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [03:18 min]
[INFO] Spark Project SQL .................................. SUCCESS [03:48 min]
[INFO] Spark Project ML Library ........................... SUCCESS [03:40 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 29.380 s]
[INFO] Spark Project Hive ................................. SUCCESS [02:53 min]
[INFO] Spark Project REPL ................................. SUCCESS [01:32 min]
[INFO] Spark Project YARN Parent POM ...................... SUCCESS [  5.124 s]
[INFO] Spark Project YARN Stable API ...................... SUCCESS [01:34 min]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 56.404 s]
[INFO] Spark Project Assembly ............................. SUCCESS [01:11 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 36.661 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 50.006 s]
[INFO] Spark Project External Flume ....................... FAILURE [ 14.287 s]
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 36:02 min
[INFO] Finished at: 2015-04-04T03:58:19+02:00
[INFO] Final Memory: 60M/330M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-streaming-flume_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:




I suspect this is some kind of dependency problem, but I can't figure it out. Can anyone help me?

3 Answers

姚培
2023-03-14

I had a similar problem today. That "Spark Project External Flume .......... FAILURE" line in the log annoyed me too, but I think git clean -xdf is what did the trick. If that is not enough, you can also try git clean -Xdf. Then run mvn again. Good luck.
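To make the difference between the two flags concrete, here is a minimal sketch in a throwaway repository (assumes git is on the PATH; paths and file names are made up for illustration). -x also removes ignored files such as stale build output, -X removes only the ignored ones, and -n dry-runs before anything is deleted:

```shell
# Demonstrate git clean -xdf vs -Xdf in a scratch repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m init
echo 'target/' > .gitignore
mkdir -p target && echo stale > target/old.class   # ignored: old build output
echo scratch > notes.txt                           # untracked working file
git clean -ndx    # dry run: lists everything that -fdx would remove
git clean -fdx    # remove untracked AND ignored files ('-fdX' would touch only ignored ones)
[ ! -e target ] && [ ! -e notes.txt ] && echo "workspace clean"
```

Running the real Spark build again after such a clean starts the failing module from a pristine state, which is what this answer is suggesting.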

卓学智
2023-03-14

If cheating is acceptable, you can simply skip the modules that fail to compile:

spark-streaming-flume_2.10和spark-streaming-kafka_2.10

The following command compiles a Spark package with Hive support for Spark SQL, against CDH 5.3.3 and Spark 1.2.0:

mvn -Pyarn -Dhadoop.version=2.5.0-cdh5.3.3 -DskipTests -Phive -Phive-thriftserver -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' package

苍恩
2023-03-14

I ran into the same problem with Apache Spark 1.2.1 when building with the command below -

mvn -e -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package

The Apache Maven version seems to play a role here. In the failing case, the Maven version was -

Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00)
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.3.3
Java version: 1.8.0, vendor: IBM Corporation
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"

When I tried an older Maven, the build succeeded. Using Apache Maven 3.2.x seems to resolve the issue. I used -

Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T12:29:23-05:00)
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.2.5
Java version: 1.8.0, vendor: IBM Corporation
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"
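Based on this report (3.3.x fails, 3.2.x works), a build script could refuse the known-bad Maven line up front. maven_ok below is a hypothetical helper; it takes the version string as an argument so the check itself needs no Maven installed - in practice you would feed it the first token of mvn -version, e.g. mvn -version | awk 'NR==1 {print $3}':

```shell
# Hypothetical guard: reject the Maven 3.3.x line that was reported
# to break the Spark 1.2.x build, accept everything else.
maven_ok() {
  case "$1" in
    3.3.*) return 1 ;;  # reported to fail for Spark 1.2.x
    *)     return 0 ;;
  esac
}

maven_ok 3.2.5 && echo "3.2.5 ok"
maven_ok 3.3.3 || echo "3.3.3 reported to fail; use 3.2.x instead"
```

This only encodes the observation from this answer, not an official compatibility matrix, so widen or narrow the pattern as you confirm versions yourself.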

Hope this helps.

Thanks, Amit
