Question:

spark-submit: "Service 'Driver' could not bind on port" error

唐宇定
2023-03-14
time spark-submit --deploy-mode cluster --master spark://192.168.0.7:6066 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt 
Running Spark using the REST application submission protocol.
16/07/18 03:55:41 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160718035543-0000. Polling submission state...
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160718035543-0000 in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: State of driver driver-20160718035543-0000 is now RUNNING.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160718041005-192.168.0.12-42405 at 192.168.0.12:42405.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160718035543-0000",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160718035543-0000",
  "success" : true
}
Launch Command: "/usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/bin/java" "-cp" "/opt/spark/conf/:/opt/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/spark/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark/lib/datanucleus-core-3.2.10.jar:/opt/spark/lib/datanucleus-rdbms-3.2.9.jar" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.JavaWordCount" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/home/pi/Desktop/example/new/target/javaword.jar" "-Dspark.master=spark://192.168.0.7:7077" "-Dspark.executor.memory=10M" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker@192.168.0.12:42405" "/opt/spark/work/driver-20160718035543-0000/javaword.jar" "org.apache.spark.examples.JavaWordCount" "/books_50.txt"
========================================

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/18 04:10:58 INFO SecurityManager: Changing view acls to: pi
16/07/18 04:10:58 INFO SecurityManager: Changing modify acls to: pi
16/07/18 04:10:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pi); users with modify permissions: Set(pi)
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'Driver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'Driver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
My spark-env.sh settings are:

export SPARK_MASTER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"
export SPARK_WORKER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"
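
The exception message itself points at two standard Spark settings: pinning the service port explicitly and raising spark.port.maxRetries (whose default of 16 matches the "failed after 16 retries" above). As a hedged sketch, not taken from the original post, they could be passed directly to spark-submit; the port value 40000 and retry count 32 are illustrative placeholders:

# Sketch only: spark.driver.port and spark.port.maxRetries are standard
# Spark properties; 40000 and 32 below are placeholder values.
spark-submit \
  --deploy-mode cluster \
  --master spark://192.168.0.7:6066 \
  --conf spark.driver.port=40000 \
  --conf spark.port.maxRetries=32 \
  --class org.apache.spark.examples.JavaWordCount \
  /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt

That said, the repeated warnings about binding on port 0 (an ephemeral port, which should always succeed) suggest the failure is in binding to the address rather than in finding a free port, which is what the answer below addresses.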

Please help...!!

1 Answer

袁赞
2023-03-14

I ran into the same problem when trying to run the shell, and I was able to get past it by setting the SPARK_LOCAL_IP environment variable. You can assign it on the command line when launching the shell:

SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell

For a more permanent solution, create a spark-env.sh file in the conf directory of your Spark root and add the following line:

SPARK_LOCAL_IP=127.0.0.1
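
A "Cannot assign requested address" bind failure usually means the driver tried to bind to whatever address the machine's hostname resolves to, and that address is not actually configured on the machine. A quick way to check this on the worker, using standard Linux commands (not from the original post):

# If this prints an address that is not assigned to any local interface
# (e.g. a stale /etc/hosts entry), binds fail with
# "Cannot assign requested address"; SPARK_LOCAL_IP overrides the choice.
hostname -i
getent hosts "$(hostname)"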
