Question:

Unresolved dependencies path for an SBT project in IntelliJ

宇文弘懿
2023-03-14

I am developing a Spark application in IntelliJ, following this guide on how to make IntelliJ work well with SBT projects. However, the build fails with the following output:

[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn]        +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM
This is my build.sbt:

name := "sparrow-to-orc"

version := "0.1"

scalaVersion := "2.11.8"

lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-hive" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0"
)

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

libraryDependencies ++= sparkDependencies.map(_ % "provided")  // provided: keep the Spark jars out of the assembly, the cluster supplies them

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")  // compile scope so the app can run inside IntelliJ
)

assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
If I remove the following lines, the build resolves without errors:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

But then I cannot run the application inside IntelliJ, because the Spark dependencies will not be on the classpath.

1 Answer

富凯旋
2023-03-14

I had the same problem. The solution is to set the Scala version in mainRunner to the same version declared at the top of the build.sbt file:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile"),
    scalaVersion := "2.11.8"  // must match the root project's scalaVersion
)
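
For context: when a subproject does not declare a scalaVersion, sbt falls back to its default Scala version (2.10.x on sbt 0.13). mainRunner therefore looked for the root artifact cross-versioned as sparrow-to-orc_2.10, which the 2.11.8 root build never produces, hence the "not found" in the log above.

One way to keep the two versions from drifting apart again is to share a common settings sequence between the projects. This is only a sketch (assuming sbt 0.13.x; the names commonSettings and root are illustrative, not from the original build, and sparkDependencies is the Seq already defined in build.sbt):

// Illustrative refactor: one shared settings Seq, so the Scala versions
// of the root project and mainRunner cannot diverge.
lazy val commonSettings = Seq(
  scalaVersion := "2.11.8",
  version := "0.1"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(name := "sparrow-to-orc")

lazy val mainRunner = (project in file("mainRunner"))
  .dependsOn(root)  // replaces dependsOn(RootProject(file(".")))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= sparkDependencies.map(_ % "compile"))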

Good luck!
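
To double-check that both projects now agree, the Scala version can be inspected from the sbt shell; these are standard sbt commands, shown here only as a usage sketch:

> project mainRunner
> show scalaVersion
> update

show scalaVersion should print 2.11.8 for both the root project and mainRunner, and update should then resolve sparrow-to-orc as an internal project dependency instead of failing with "not found".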
