Question:

Why is spark-submit using the wrong Java version?

孔欣荣
2023-03-14

I set up a Spark EC2 cluster with the bin/spark-ec2 script. When I ssh into the master node and run spark-submit on one of the example programs, every executor fails with the following error:

(java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)

The strange part is why Spark is looking for java-1.7.0-openjdk-1.7.0.85.x86_64 at all. I have JAVA_HOME set to /usr/lib/jvm/jre-1.8.0-openjdk, and even a recursive grep for openjdk-1.7.0.85 turns up nothing. So why is spark-submit trying to use a seemingly random Java version that isn't even installed on the master or the slaves?
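(For reference, a sketch of the checks described above; the paths come from the error message and from JAVA_HOME, and /root/spark is the usual spark-ec2 install location:)

echo $JAVA_HOME                       # /usr/lib/jvm/jre-1.8.0-openjdk
which java && java -version           # what the login shell resolves
ls /usr/lib/jvm/                      # JVMs actually installed on this node
grep -R "1.7.0.85" /root/spark/       # the recursive grep; finds nothing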

The full output follows:

[ec2-user@ip-172-31-35-149 spark]$ sudo ./bin/spark-submit --class org.apache.spark.examples.mllib.LinearRegression lib/spark-examples-1.4.1-hadoop1.0.4.jar  data/mllib/sample_linear_regression_data.txt
15/08/18 18:26:46 INFO spark.SparkContext: Running Spark version 1.4.1
15/08/18 18:26:46 INFO spark.SecurityManager: Changing view acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: Changing modify acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/18 18:26:47 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/08/18 18:26:47 INFO Remoting: Starting remoting
15/08/18 18:26:47 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.31.35.149:35948]
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'sparkDriver' on port 35948.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering MapOutputTracker
15/08/18 18:26:47 INFO spark.SparkEnv: Registering BlockManagerMaster
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51
15/08/18 18:26:47 INFO storage.MemoryStore: MemoryStore started with capacity 265.1 MB
15/08/18 18:26:47 INFO spark.HttpFileServer: HTTP File server directory is /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:47 INFO spark.HttpServer: Starting HTTP Server
15/08/18 18:26:47 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:47 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:43864
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'HTTP file server' on port 43864.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/08/18 18:26:48 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:48 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/08/18 18:26:48 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/08/18 18:26:48 INFO ui.SparkUI: Started SparkUI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:48 INFO spark.SparkContext: Added JAR file:/root/spark/lib/spark-examples-1.4.1-hadoop1.0.4.jar at http://172.31.35.149:43864/jars/spark-examples-1.4.1-hadoop1.0.4.jar with timestamp 1439922408595
15/08/18 18:26:48 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@ec2-54-187-197-56.us-west-2.compute.amazonaws.com:7077/user/Master...
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150818182649-0015
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/0 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/0 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/1 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/1 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/0 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 0
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/2 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/2 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/1 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 1
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/3 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/3 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/2 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 2
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/4 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/4 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/3 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 3
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/5 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/5 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/4 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 4
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/6 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/6 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/5 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 5
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/7 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/7 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/6 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 6
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/8 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/8 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/7 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 7
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/9 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/9 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/8 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 8
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/10 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/10 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/9 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 9
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: Master removed our application: FAILED
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/08/18 18:26:49 INFO ui.SparkUI: Stopped Spark web UI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:49 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
15/08/18 18:26:49 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34521.
15/08/18 18:26:49 INFO netty.NettyBlockTransferService: Server created on 34521
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/08/18 18:26:49 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.31.35.149:34521 with 265.1 MB RAM, BlockManagerId(driver, 172.31.35.149, 34521)
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Registered BlockManager
15/08/18 18:26:49 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
    at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
    at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO storage.DiskBlockManager: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: path = /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: path = /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789

1 answer

严兴言
2023-03-14

I had upgraded Java from java-1.7.0-openjdk-1.7.0.85.x86_64 to 1.8, but forgot to bounce my Spark workers. The long-running worker processes therefore still carried the environment they were started with, which pointed at the pre-upgrade Java path.
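A minimal sketch of that fix, assuming the spark-ec2 layout (Spark under /root/spark on every node; copy-dir is the spark-ec2 helper for syncing a directory to the slaves, though plain rsync/scp works too):

# On a slave node: the stale path lives in the environment of the running
# Worker process, not on disk, which is why grepping the filesystem finds nothing:
cat /proc/$(pgrep -f org.apache.spark.deploy.worker.Worker)/environ | tr '\0' '\n' | grep JAVA_HOME

# On the master: pin the new JVM explicitly so a restart cannot inherit a stale value,
# then push the config out to the slaves:
echo 'export JAVA_HOME=/usr/lib/jvm/jre-1.8.0-openjdk' >> /root/spark/conf/spark-env.sh
/root/spark-ec2/copy-dir /root/spark/conf

# Bounce the standalone master and workers so they re-read the environment:
/root/spark/sbin/stop-all.sh
/root/spark/sbin/start-all.sh

After the restart, the executors should launch with the JVM that JAVA_HOME now points to.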
