Question:

NoSuchMethodError when running spark-submit on a YARN cluster

邓欣德
2023-03-14

I have a Spark application that runs correctly in local mode. When I run spark-submit on a YARN cluster, I get the following error:

18/07/26 18:12:38 ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
        at fr.test.ssl.SecrityHttpClient$.getHttpClientWithoutSSL(SecrityHttpClient.scala:23)
        at fr.test.processor.HttpProcessor$.execute(HttpProcessor.scala:36)
        at fr.test.engine.RequestEngine$$anonfun$executeHttpRequest$2.apply(RequestEngine.scala:28)
        at fr.test.engine.RequestEngine$$anonfun$executeHttpRequest$2.apply(RequestEngine.scala:21)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at fr.test.engine.RequestEngine$.executeHttpRequest(RequestEngine.scala:21)
        at fr.test.launcher.Launcher$.executeRequestList(Launcher.scala:20)
        at fr.test.launcher.Launcher$.main(Launcher.scala:10)
        at fr.test.launcher.Launcher.main(Launcher.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
18/07/26 18:12:38 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;)

It looks like the httpclient dependency cannot be found. Here is my build.sbt:

import aether.AetherKeys._

name := "my_app"

organization := "fr.test"
version := "0.1"
scalaVersion := "2.10.6"

val httpclientVersion = "4.5.6"
val slickVersion = "3.1.1"
val hikariCPVersion = "2.4.6"

libraryDependencies += "com.google.code.gson" % "gson" % "2.8.5"
libraryDependencies += "com.typesafe.slick" %% "slick-hikaricp" % slickVersion exclude("com.zaxxer", "HikariCP-java6")
libraryDependencies += "com.zaxxer" % "HikariCP" % hikariCPVersion

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.2"
libraryDependencies += "com.springml" %% "spark-sftp" % "1.0.2"
// logging
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7"
libraryDependencies += "com.sndyuk" % "logback-more-appenders" % "1.4.2"

// https://mvnrepository.com/artifact/com.databricks/spark-csv
libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"

libraryDependencies += "org.apache.httpcomponents" % "httpclient" % httpclientVersion

libraryDependencies += "org.postgresql" % "postgresql" % "9.4.1208"

aetherOldVersionMethod := true
overridePublishSettings

mainClass in assembly := Some("fr.test.Launcher")
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case "reference.conf" => MergeStrategy.concat
  case x => MergeStrategy.first
}

// Configure assembly artifact to be published
artifact in(Compile, assembly) := {
  val art = (artifact in(Compile, assembly)).value
  art.withClassifier(Some("assembly"))
}
addArtifact(artifact in(Compile, assembly), assembly)
And here is my spark-submit command:

spark-submit --class fr.test.Launcher \
--master yarn-cluster \
--num-executors 4 \
--driver-memory 10g \
--executor-memory 5g \
--queue dlk_dev \
--files /home/my_user/my_app_2.10/-SNAPSHOT/application--SNAPSHOT.conf#app.conf \
--conf "spark.driver.extraJavaOptions=-verbose:class" \
--conf "spark.executor.extraJavaOptions=-verbose:class" \
/home/my_user/my_app_2.10/-SNAPSHOT/my_app_2.10--SNAPSHOT.jar 
The -verbose:class output shows that HttpClientBuilder is being loaded from the cluster's Spark assembly jar, not from my uber jar:

[Loaded org.apache.http.impl.client.HttpClientBuilder from file:/data/5/yarn/local/filecache/216/spark-hdp-assembly.jar]
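For reference, the same check can also be done from inside the application itself; a minimal sketch using only the standard java.security API (the class name is taken from the stack trace above, nothing here is verified code from the question):

// Print which jar HttpClientBuilder was actually loaded from at runtime.
// getCodeSource can be null for bootstrap classes, hence the Option wrapper.
val source = Option(classOf[org.apache.http.impl.client.HttpClientBuilder]
  .getProtectionDomain.getCodeSource)
source.foreach(cs => println(s"HttpClientBuilder loaded from: ${cs.getLocation}"))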

Any ideas?

1 Answer

祁飞飙
2023-03-14

Spark ships with its own HTTP client, so most likely the httpclient version you compiled against differs from the one deployed on the YARN cluster (HttpClientBuilder.setSSLContext was only added in httpclient 4.5, so an older 4.x on the cluster classpath would not have that method, hence the NoSuchMethodError). You can set the Spark configuration option spark.executor.userClassPathFirst to true so that user-provided jars (in this case just your uber jar) are placed first on the classpath. That should let your httpclient version be picked up first. Since the error here is thrown in the ApplicationMaster (the driver in yarn-cluster mode), the corresponding driver option spark.driver.userClassPathFirst likely needs to be set as well.
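For example, a sketch based on the submit command from the question (the paths, queue and jar name are the asker's own placeholders):

spark-submit --class fr.test.Launcher \
--master yarn-cluster \
--conf "spark.driver.userClassPathFirst=true" \
--conf "spark.executor.userClassPathFirst=true" \
--queue dlk_dev \
/home/my_user/my_app_2.10/-SNAPSHOT/my_app_2.10--SNAPSHOT.jar

Note that userClassPathFirst is documented as experimental and can surface new conflicts, because every class in the uber jar then shadows the cluster's copy. A more targeted alternative, assuming sbt-assembly 0.14+ is available, is to shade the conflicting package inside the uber jar:

// In build.sbt: rename the org.apache.http classes bundled in the uber jar
// so they can no longer collide with the copy inside the Spark assembly.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.apache.http.**" -> "shaded.org.apache.http.@1").inAll
)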
