SparkApp
|--- simple.sbt
|--- src
     |--- main
          |--- scala
               |--- SimpleApp.scala
simple.sbt:

name := "Simple Project"
version := "0.13.15"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
hadoop@master:~/Mycode/SparkApp$ sbt package
[warn] Executing in batch mode.
[warn] For better performance, hit [ENTER] to switch to interactive mode, or
[warn] consider launching sbt without any commands, or explicitly passing 'shell'
[info] Loading project definition from /home/hadoop/Mycode/SparkApp/project
[info] Set current project to Simple Project (in build file:/home/hadoop/Mycode/SparkApp/)
[info] Compiling 1 Scala source to /home/hadoop/Mycode/SparkApp/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'SparkContext.class'.
[error] Could not access term akka in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'SparkContext.class' was compiled against an incompatible version of <root>.
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 16, 2017 1:08:53 PM
Some hints about where the problem may lie:

------------------ Second edit ------------------
/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "file:///usr/local/spark/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
You don't have to install Spark to develop Spark applications. Having Spark installed locally does help a lot in your early days as a Spark developer (with spark-shell and spark-submit), but it is not required, though still strongly recommended. In other words, what you install as a Spark package has nothing to do with what you can and want to use while developing a Spark application.
The only dependency your build needs is:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

so simple.sbt comes down to:

name := "Simple Project"
version := "0.13.15"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

with the sbt version pinned in project/build.properties:

sbt.version = 0.13.15
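For completeness, here is a sketch of how the packaged jar would be run; this is not part of the original answer, and the jar name assumes sbt's default artifact naming derived from the name, version, and scalaVersion settings above. Note that running with spark-submit, unlike compiling, does require a local Spark installation; "local[2]" is an arbitrary choice of local master:

hadoop@master:~/Mycode/SparkApp$ sbt package
hadoop@master:~/Mycode/SparkApp$ spark-submit \
    --class SimpleApp \
    --master "local[2]" \
    target/scala-2.11/simple-project_2.11-0.13.15.jar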
The problem you face while executing sbt package (in /home/hadoop/Mycode/SparkApp) is that your application somehow defines a dependency on Akka, as you can see in the error message:
[info] Set current project to Simple Project (in build file:/home/hadoop/Mycode/SparkApp/)
[info] Compiling 1 Scala source to /home/hadoop/Mycode/SparkApp/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'SparkContext.class'.
[error] Could not access term akka in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'SparkContext.class' was compiled against an incompatible version of <root>.
[error] one error found
[error] (compile:compileIncremental) Compilation failed
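The quoted message itself hints that "A full rebuild may help", so wiping the incremental-compilation state is a cheap first check (a suggestion beyond the original answer):

hadoop@master:~/Mycode/SparkApp$ sbt clean package

If stale classes survive even that, deleting the build output directories outright forces a completely fresh compile:

hadoop@master:~/Mycode/SparkApp$ rm -rf target project/target
hadoop@master:~/Mycode/SparkApp$ sbt package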
Akka has not been used by Spark since Spark 1.6, so my guess is that the project somehow references the Akka libraries, which it should not if spark-core is its only dependency. Lots of guessing; I hope we can figure it out soon. One way to see where Akka sneaks in is to print the dependency tree, as sketched below.
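A minimal sketch of that check, assuming the sbt-dependency-graph plugin (the plugin version 0.8.2 is an assumption; none of this is from the original answer). With sbt 0.13 the plugin adds a dependencyTree task:

// project/plugins.sbt -- wires in the dependency-graph plugin
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then, from the project root:

hadoop@master:~/Mycode/SparkApp$ sbt dependencyTree

Any org.typesafe.akka artifact in the printed tree points at whatever library is dragging Akka in; spark-core 2.1.1 itself should pull in none.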
I have seen people hit a similar error message (error while loading CharSequence, class file '...\rt.jar(java/lang/CharSequence.class)' is broken (bad constant pool tag 15 at byte 1470)), and the most common fix is to downgrade or upgrade the Java/Scala/sbt versions; one answer, for example, simply listed the version combination that worked.
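A hedged illustration of that check (the commands are standard, but the suggestion to run them is mine, not the original answers'). The three versions have to line up with what the build expects, which here is Scala 2.11.8 and sbt 0.13.15:

hadoop@master:~/Mycode/SparkApp$ java -version            # JDK that sbt and Spark run on
hadoop@master:~/Mycode/SparkApp$ sbt sbtVersion           # should match project/build.properties (0.13.15)
hadoop@master:~/Mycode/SparkApp$ sbt "show scalaVersion"  # must match the _2.11 binary suffix spark-core resolves to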