Question:

<console>:22: error: not found: value sc

南门飞扬
2023-03-14

I am completely new to Spark and still learning it. While practicing, I ran into the following problems; there are multiple steps and it is quite long. I am using spark-shell in a UNIX environment, and I get the errors shown below.

Step 1


    $ spark-shell
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 1.3.1
          /_/

    Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_25)
    Type in expressions to have them evaluated.
    Type :help for more information.
    2016-04-22 07:44:31,5095 ERROR JniCommon fs/client/fileclient/cc/jni_MapRClient.cc:1473 Thread: 20535 mkdirs failed for /user/cni/.sparkStaging/application_1459074732364_1192326, error 13
    org.apache.hadoop.security.AccessControlException: User cni(user id 5689)  has been denied access to create application_1459074732364_1192326
            at com.mapr.fs.MapRFileSystem.makeDir(MapRFileSystem.java:1100)
            at com.mapr.fs.MapRFileSystem.mkdirs(MapRFileSystem.java:1120)
            at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1851)
            at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:631)
            at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:224)
            at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:384)
            at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:102)
            at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:58)
            at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:381)
            at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
            at $iwC$$iwC.<init>(<console>:9)
            at $iwC.<init>(<console>:18)
            at <init>(<console>:20)
            at .<init>(<console>:24)
            at .<clinit>(<console>)
            at .<init>(<console>:7)
            at .<clinit>(<console>)
            at $print(<console>)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
            at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
            at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
            at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
            at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
            at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
            at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
            at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
            at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
            at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
            at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
            at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
            at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
            at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
            at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
            at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
            at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
            at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
            at org.apache.spark.repl.Main$.main(Main.scala:31)
            at org.apache.spark.repl.Main.main(Main.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    java.lang.NullPointerException
            at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:145)
            at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:49)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
            at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1027)
            at $iwC$$iwC.<init>(<console>:9)
            at $iwC.<init>(<console>:18)
            at <init>(<console>:20)
            at .<init>(<console>:24)
            at .<clinit>(<console>)
            at .<init>(<console>:7)
            at .<clinit>(<console>)
            at $print(<console>)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
            at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
            at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
            at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
            at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
            at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
            at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
            at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
            at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
            at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
            at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
            at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
            at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
            at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
            at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
            at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
            at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
            at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
            at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
            at org.apache.spark.repl.Main$.main(Main.scala:31)
            at org.apache.spark.repl.Main.main(Main.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    <console>:10: error: not found: value sqlContext
           import sqlContext.implicits._
                  ^
    <console>:10: error: not found: value sqlContext
           import sqlContext.sql
                  ^
    

Step 2

I simply ignored the warnings/errors above and continued writing code. I had read that if I use spark-shell, sc is created automatically, so I wrote the following.

    scala> val textFile = sc.textFile("README.md")
    <console>:13: error: not found: value sc
           val textFile = sc.textFile("README.md")
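
For reference, spark-shell only binds sc and sqlContext when startup succeeds; here the AccessControlException in Step 1 aborted context creation, so neither value exists. Below is a minimal sketch of creating them by hand with Spark 1.3.x APIs; the "local[2]" master and the app name are illustrative assumptions, and running locally only sidesteps the YARN staging-directory failure without fixing the underlying permission problem:

    // Manually create the contexts the shell failed to bind (Spark 1.3.x APIs).
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    val conf = new SparkConf().setMaster("local[2]").setAppName("manual-shell")  // illustrative values
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._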

Step 3: Since, as stated above, sc was not found, I tried to create it.

    scala> import org.apache.spark._
    import org.apache.spark._

    scala> import org.apache.spark.streaming._
    import org.apache.spark.streaming._

    scala> import org.apache.spark.streaming.StreamingContext._
    import org.apache.spark.streaming.StreamingContext._

    scala> val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040" ).set("spark.driver.allowMultipleContexts", "true")
    conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@1a58697d

    scala> val ssc = new StreamingContext(conf, Seconds(2) )
    16/04/22 08:19:18 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor).  This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
    org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
    org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
    $line3.$read$$iwC$$iwC.<init>(<console>:9)
    $line3.$read$$iwC.<init>(<console>:18)
    $line3.$read.<init>(<console>:20)
    $line3.$read$.<init>(<console>:24)
    $line3.$read$.<clinit>(<console>)
    $line3.$eval$.<init>(<console>:7)
    $line3.$eval$.<clinit>(<console>)
    $line3.$eval.$print(<console>)
    sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    java.lang.reflect.Method.invoke(Method.java:606)
    org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@15492914

Since Spark told me this was a warning (though it also said it may indicate an error), I ignored it and went on to create an RDD. Again, I am not sure here: is this an error or a warning?
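
For reference, the warning appears because spark-shell had already tried to construct its own SparkContext, and only one SparkContext may run per JVM (SPARK-2243); spark.driver.allowMultipleContexts merely suppresses the check. A minimal sketch of the usual approach, which builds the StreamingContext on top of the shell's existing sc instead of a new SparkConf (assuming sc was bound successfully, which was not the case here because of the permission failure):

    // Reuse the existing SparkContext (sc) so no second context is constructed.
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val ssc = new StreamingContext(sc, Seconds(2))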

Step 4

I created the RDD as follows.

    scala> var fil = ssc.textFile("/mapr/datalake/01.Call_ID.txt")
    <console>:21: error: value textFile is not a member of org.apache.spark.streaming.StreamingContext
           var fil = ssc.textFile("/mapr/datalake/01.Call_ID.txt")
                         ^

Here it says that textFile is not a member of StreamingContext. All of this is driving me crazy. Also, I work for a company and run these scripts on a company laptop (JFYI).
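
For reference, textFile is a method of SparkContext and returns a batch RDD, while StreamingContext exposes textFileStream, which watches a directory for newly created files. A minimal sketch of both calls, assuming sc and ssc exist; the streaming directory path is an illustrative placeholder, not from the original post:

    // Batch: read an existing file into an RDD[String] via SparkContext.
    val fil = sc.textFile("/mapr/datalake/01.Call_ID.txt")

    // Streaming: monitor a directory for newly arriving files as a DStream[String].
    // Note: textFileStream takes a directory, not a single file ("incoming" is hypothetical).
    val lines = ssc.textFileStream("/mapr/datalake/incoming")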

2 Answers

吕修伟
2023-03-14

There seems to be a problem with creating folders under your user directory in HDFS.

Check the permissions of the folder /user/cni/.

You can try granting full access to the user folder with the following command:

    hdfs dfs -chmod -R 777 /user/cni

This is not recommended on shared clusters or in production, but it can help you determine whether this is an access problem.
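
If direct access to the hdfs CLI is limited, the same permission check can be done from the Spark shell through the Hadoop FileSystem API. A minimal sketch, assuming the shell picks up the cluster's Hadoop configuration:

    // Inspect owner, group, and permission bits of /user/cni from the JVM.
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    val fs = FileSystem.get(new Configuration())
    val st = fs.getFileStatus(new Path("/user/cni"))
    println(s"owner=${st.getOwner} group=${st.getGroup} perms=${st.getPermission}")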

洪祺
2023-03-14

I think this is all happening because of a lack of permissions. Assuming you have the right access to use the cluster, you can type:

    HADOOP_USER_NAME=hdfs spark-shell

This will override the permissions of your account.
