Question:

Spark installation error on Win10: "None.org.apache.spark.api.java.JavaSparkContext"

史烨
2023-03-14

Recently I have been struggling, without success, to get Spark running on my Windows 10 machine. I only want to try Spark out and be able to follow tutorials, so I currently have no access to a cluster to connect to. To install Spark, I completed the following steps according to this tutorial:

  • I installed the Java JDK and placed it at C:\JDK; the folder contains the bin, conf, include, jmods, legal, and lib subfolders.

When I then type pyspark in the Anaconda prompt, I get the following output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/12/05 12:22:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/12/05 12:22:47 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
py4j.ClientServerConnection.run(ClientServerConnection.java:106)
java.base/java.lang.Thread.run(Thread.java:833)
C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\shell.py:42: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\shell.py", line 38, in <module>
    spark = SparkSession._create_shell_session()  # type: ignore
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\sql\session.py", line 553, in _create_shell_session
    return SparkSession.builder.getOrCreate()
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\sql\session.py", line 228, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\context.py", line 392, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\context.py", line 146, in __init__
    self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\context.py", line 209, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "C:\Spark\spark-3.2.0-bin-hadoop3.2\python\pyspark\context.py", line 329, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "C:\Users\lazarea\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1573, in __call__
    return_value = get_return_value(
  File "C:\Users\lazarea\Anaconda3\lib\site-packages\py4j\protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
        at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
        at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
        at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
        at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
        at java.base/java.lang.Thread.run(Thread.java:833)

I looked around on Stack Overflow for similar questions and found this one, which has a similar error message. However, the solution provided there, setting the SPARK_LOCAL_IP user environment variable to localhost, did not resolve the issue: the same error message persists when typing pyspark in the Anaconda prompt.
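
For completeness, the same setting can also be applied from inside Python before any Spark session is created; a minimal sketch of that attempted fix:

import os

# The fix suggested in the linked question: bind Spark's driver to localhost.
# This must run before any SparkContext/SparkSession is created in the process.
os.environ["SPARK_LOCAL_IP"] = "localhost"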

Note #1, which may be related: typing pyspark at the command line produces no output; by default, Windows opens the Microsoft Store instead.

Note #2: I tried running the code directly from Python to see whether it gave any other hints. I ran the following snippet:

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('sampleApp').getOrCreate()

It returned an error message similar to the one above, plus some potentially useful information:

An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$
(in unnamed module @0x776b83cc) cannot access class sun.nio.ch.DirectBuffer
(in module java.base) because module java.base does not export sun.nio.ch
to unnamed module @0x776b83cc
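
The message suggests the JVM module system is blocking Spark's access to sun.nio.ch, a symptom of running Spark 3.2 on a recent JDK. A commonly reported workaround, untested on this setup, is to re-export that package to the unnamed module via a driver JVM flag; a minimal sketch:

from pyspark.sql import SparkSession

# Untested workaround sketch: on Java 16+ the module system hides sun.nio.ch,
# which Spark's StorageUtils needs; --add-exports makes it visible again.
spark = (
    SparkSession.builder
    .appName("sampleApp")
    .config("spark.driver.extraJavaOptions",
            "--add-exports=java.base/sun.nio.ch=ALL-UNNAMED")
    .getOrCreate()
)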

Note #3: When I open the command line and type spark-shell, the following error is output:

java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x3c947bc5) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x3c947bc5
  at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
  at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
  at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
  at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
  at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
  at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
  ... 55 elided
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

Please help me get Spark started successfully, as I cannot figure out what I might be missing at this point.

1 Answer

董谦
2023-03-14

In the end I got it working, so let me share what I learned, for future reference in case anyone else gets stuck installing Apache Spark. There are three key aspects to installing Apache Spark on a Windows 10 machine:


  • Make sure you have Java 8 installed! Many of us fall into the trap of downloading the now-default Java 17, which Apache Spark does not support. There is the option of choosing between Java 8 and Java 11, but based on the discussion in this thread I concluded that, for my quick POC example, the Java 11 JDK and JRE were not worth the extra hassle, so I went with Java 8; both the JDK and the JRE are easy to download from the Oracle website. Note that the later the version you pick, the more secure it is, so for anything more serious I would probably choose Java 11.

  • Move the freshly installed Java folders to the C: drive: create a C:\jdk folder for the Java 8 JDK and a C:\jre folder for the Java 8 JRE. This way no JAVA_HOME environment variable juggling is needed, since both sit directly at the root of the C: drive.

  • Use an older version of Spark! It turns out that the latest stable release currently offered on the Apache Spark website, 3.2.0 from October 2021, has repeatedly been reported to produce this and similar issues when initializing the Spark context. So I rolled back to an earlier version: specifically, I downloaded Apache Spark 3.0.3, released in June 2021, and pointed the SPARK_HOME environment variable at the newly extracted folder at C:\Spark\spark-3.0.3-bin-hadoop2.7 (a small sanity check of these settings is sketched below, after the welcome banner).

    After all these modifications I closed every command-line window, opened a new one, and ran spark-shell; finally I got the long-awaited Spark welcome screen:

    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.0.3
          /_/
    
    Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_301)
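
For anyone who wants to verify the three points above before launching spark-shell, here is a small, hypothetical sanity-check script; the expected Java version string and folder name are assumptions matching the exact choices made in this answer:

import os
import subprocess

# Hypothetical sanity check for the steps above; version/paths are assumptions.
# Note that `java -version` prints to stderr, not stdout.
java_info = subprocess.run(["java", "-version"],
                           capture_output=True, text=True).stderr
assert '"1.8' in java_info, "Spark 3.0.x expects Java 8 (reported as 1.8)"

spark_home = os.environ.get("SPARK_HOME", "")
assert spark_home.endswith("spark-3.0.3-bin-hadoop2.7"), \
    "SPARK_HOME should point at the extracted Spark 3.0.3 folder"

print("Looks good:", java_info.splitlines()[0], "|", spark_home)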
    
