Question:

Failed to connect to Spark master: InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible

凌炜
2023-03-14


  • I installed Spark on a Linux machine. The version is spark-1.6.2-bin-hadoop2.6.tgz.
  • Then I started Spark with ./sbin/start-all.sh.
  • I tried to run the JavaWordCount.java example from Eclipse, but it always fails. Can anyone help? (See the driver sketch after the stack trace below.)

    The exception is as follows:

    16/07/25 12:01:20 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark:// hostname:7077...
    16/07/25 12:01:20 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master hostname:7077
    org.apache.spark.SparkException: Exception thrown in awaitResult
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
        at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local class serialVersionUID = 18257903091306170
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
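
    For context, a driver built from the bundled JavaWordCount example presumably looks something like the sketch below; the host name, master port, and input path are placeholders. The detail that matters for this error is that the client serializes its registration message (which carries an RpcEndpointRef) using the Spark classes on the Eclipse build path, and the master deserializes it with its own classes, so the two versions must agree.

        import java.util.Arrays;

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaPairRDD;
        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.JavaSparkContext;

        import scala.Tuple2;

        public class JavaWordCount {
            public static void main(String[] args) {
                // The master URL must match the spark:// URL shown on the standalone master's web UI.
                SparkConf conf = new SparkConf()
                        .setAppName("JavaWordCount")
                        .setMaster("spark://hostname:7077");
                JavaSparkContext sc = new JavaSparkContext(conf);

                JavaRDD<String> lines = sc.textFile("input.txt");
                // In Spark 1.6 the FlatMapFunction lambda returns an Iterable (it returns an Iterator in 2.x).
                JavaRDD<String> words = lines.flatMap(line -> Arrays.asList(line.split(" ")));
                JavaPairRDD<String, Integer> counts = words
                        .mapToPair(word -> new Tuple2<>(word, 1))
                        .reduceByKey((a, b) -> a + b);

                for (Tuple2<String, Integer> pair : counts.collect()) {
                    System.out.println(pair._1() + ": " + pair._2());
                }
                sc.stop();
            }
        }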
    
  • 1 answer

    谭煜
    2023-03-14

    This problem is caused by a version mismatch. I tried installing Hadoop and using spark-assembly-1.6.2-hadoop2.6.0.jar on the client side, and it now works fine.
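
    In other words, the fix is to make the Spark version on the driver's classpath match the cluster, here 1.6.2, for example by putting spark-assembly-1.6.2-hadoop2.6.0.jar (or, in a Maven-based Eclipse project, the org.apache.spark:spark-core_2.10:1.6.2 artifact) on the build path instead of a newer release. Judging by the StandaloneAppClient class in the trace, the driver appears to be linked against a Spark 2.x release while the master runs 1.6.2. A quick, hypothetical sanity check for the driver side is a sketch like the one below, which runs locally without touching the cluster:

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;

        public class SparkVersionCheck {
            public static void main(String[] args) {
                // Print the Spark version compiled into the driver's classpath; it should
                // report 1.6.2, the same version shown on the standalone master's web UI.
                SparkConf conf = new SparkConf().setAppName("SparkVersionCheck").setMaster("local[*]");
                JavaSparkContext sc = new JavaSparkContext(conf);
                System.out.println("Driver-side Spark version: " + sc.sc().version());
                sc.stop();
            }
        }

    Whichever dependency mechanism is used, the driver, master, and workers should all report the same Spark version.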
