
Apache Spark and Java error - Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2

唐照
2023-03-14
Question

I am new to the Spark framework. I am trying to create a sample application using Spark and Java, and I have the following code.

pom.xml

<dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.1</version>
</dependency>

SparkTest.java

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;

public class SparkTest {
    public static void main(String[] args)  {
        SparkConf sparkConf = new SparkConf()
                .setAppName("Example Spark App")
                .setMaster("local[*]"); // Delete this line when submitting to a cluster
        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
        JavaRDD<String> stringJavaRDD = sparkContext.textFile("nationalparks.csv");
        System.out.println("Number of lines in file = " + stringJavaRDD.count());
    }
}

I am trying to run the above code from the IntelliJ IDE, but I get the following error:

"C:\Program Files\Java\jdk-11\bin\java.exe" "-javaagent:C:\Users\amanaf\AppData\Local\JetBrains\IntelliJ IDEA Community Edition 2018.3\lib\idea_rt.jar=55665:C:\Users\amanaf\AppData\Local\JetBrains\IntelliJ IDEA Community Edition 2018.3\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\amanaf\IdeaProjects\testApp\target\classes;C:\Users\amanaf\.m2\repository\org\apache\spark\spark-core_2.10\1.6.1\spark-core_2.10-1.6.1.jar;C:\Users\amanaf\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;C:\Users\amanaf\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7.jar;C:\Users\amanaf\.m2\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;C:\Users\amanaf\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7-tests.jar;C:\Users\amanaf\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\amanaf\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\amanaf\.m2\repository\com\twitter\chill_2.10\0.5.0\chill_2.10-0.5.0.jar;C:\Users\amanaf\.m2\repository\com\esotericsoftware\kryo\kryo\2.21\kryo-2.21.jar;C:\Users\amanaf\.m2\repository\com\esotericsoftware\reflectasm\reflectasm\1.07\reflectasm-1.07-shaded.jar;C:\Users\amanaf\.m2\repository\com\esotericsoftware\minlog\minlog\1.2\minlog-1.2.jar;C:\Users\amanaf\.m2\repository\org\objenesis\objenesis\1.2\objenesis-1.2.jar;C:\Users\amanaf\.m2\repository\com\twitter\chill-java\0.5.0\chill-java-0.5.0.jar;C:\Users\amanaf\.m2\repository\org\apache\xbean\xbean-asm5-shaded\4.4\xbean-asm5-shaded-4.4.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-client\2.2.0\hadoop-client-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-common\2.2.0\hadoop-common-2.2.0.jar;C:\Users\amanaf\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\amanaf\.m2\repository\org\apache\commons\commons-math\2.1\commons-math-2.1.jar;C:\Users\amanaf\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\amanaf\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\amanaf\.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\amanaf\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\amanaf\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\amanaf\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-auth\2.2.0\hadoop-auth-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\commons\commons-compress\1.4.1\commons-compress-1.4.1.jar;C:\Users\amanaf\.m2\repository\org\tukaani\xz\1.0\xz-1.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.2.0\hadoop-hdfs-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.2.0\hadoop-mapreduce-client-app-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.2.0\hadoop-mapreduce-client-common-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-yarn-client\2.2.0\hadoop-yarn-client-2.2.0.jar;C:\Users\amanaf\.m2\repository\com\google\inject\guice\3.0\guice-3.0.jar;C:\Users\amanaf\.m2\repository\javax\inject\javax.inject\1\javax.inject-1.jar;C:\Users\amanaf\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\amanaf\.m2\reposito
ry\com\sun\jersey\jersey-test-framework\jersey-test-framework-grizzly2\1.9\jersey-test-framework-grizzly2-1.9.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-test-framework\jersey-test-framework-core\1.9\jersey-test-framework-core-1.9.jar;C:\Users\amanaf\.m2\repository\javax\servlet\javax.servlet-api\3.0.1\javax.servlet-api-3.0.1.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-grizzly2\1.9\jersey-grizzly2-1.9.jar;C:\Users\amanaf\.m2\repository\org\glassfish\grizzly\grizzly-http\2.1.2\grizzly-http-2.1.2.jar;C:\Users\amanaf\.m2\repository\org\glassfish\grizzly\grizzly-framework\2.1.2\grizzly-framework-2.1.2.jar;C:\Users\amanaf\.m2\repository\org\glassfish\gmbal\gmbal-api-only\3.0.0-b023\gmbal-api-only-3.0.0-b023.jar;C:\Users\amanaf\.m2\repository\org\glassfish\external\management-api\3.0.0-b012\management-api-3.0.0-b012.jar;C:\Users\amanaf\.m2\repository\org\glassfish\grizzly\grizzly-http-server\2.1.2\grizzly-http-server-2.1.2.jar;C:\Users\amanaf\.m2\repository\org\glassfish\grizzly\grizzly-rcm\2.1.2\grizzly-rcm-2.1.2.jar;C:\Users\amanaf\.m2\repository\org\glassfish\grizzly\grizzly-http-servlet\2.1.2\grizzly-http-servlet-2.1.2.jar;C:\Users\amanaf\.m2\repository\org\glassfish\javax.servlet\3.1\javax.servlet-3.1.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;C:\Users\amanaf\.m2\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;C:\Users\amanaf\.m2\repository\stax\stax-api\1.0.1\stax-api-1.0.1.jar;C:\Users\amanaf\.m2\repository\com\sun\xml\bind\jaxb-impl\2.2.3-1\jaxb-impl-2.2.3-1.jar;C:\Users\amanaf\.m2\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;C:\Users\amanaf\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\amanaf\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.8.3\jackson-jaxrs-1.8.3.jar;C:\Users\amanaf\.m2\repository\org\codehaus\jackson\jackson-xc\1.8.3\jackson-xc-1.8.3.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\contribs\jersey-guice\1.9\jersey-guice-1.9.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.2.0\hadoop-yarn-server-common-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.2.0\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.2.0\hadoop-yarn-api-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.2.0\hadoop-mapreduce-client-core-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.2.0\hadoop-yarn-common-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.2.0\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\hadoop\hadoop-annotations\2.2.0\hadoop-annotations-2.2.0.jar;C:\Users\amanaf\.m2\repository\org\apache\spark\spark-launcher_2.10\1.6.1\spark-launcher_2.10-1.6.1.jar;C:\Users\amanaf\.m2\repository\org\apache\spark\spark-network-common_2.10\1.6.1\spark-network-common_2.10-1.6.1.jar;C:\Users\amanaf\.m2\repository\org\apache\spark\spark-network-shuffle_2.10\1.6.1\spark-network-shuffle_2.10-1.6.1.jar;C:\Users\amanaf\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\amanaf\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.4.4\jackson-annotations-2.4.4.jar;C:\Users\amanaf\.m2\repository\org\apache\spark\spark-unsafe_2.10\1.6.1\spark-unsafe_2.10-1.6.1.jar
;C:\Users\amanaf\.m2\repository\net\java\dev\jets3t\jets3t\0.7.1\jets3t-0.7.1.jar;C:\Users\amanaf\.m2\repository\commons-codec\commons-codec\1.3\commons-codec-1.3.jar;C:\Users\amanaf\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\amanaf\.m2\repository\org\apache\curator\curator-recipes\2.4.0\curator-recipes-2.4.0.jar;C:\Users\amanaf\.m2\repository\org\apache\curator\curator-framework\2.4.0\curator-framework-2.4.0.jar;C:\Users\amanaf\.m2\repository\org\apache\curator\curator-client\2.4.0\curator-client-2.4.0.jar;C:\Users\amanaf\.m2\repository\org\apache\zookeeper\zookeeper\3.4.5\zookeeper-3.4.5.jar;C:\Users\amanaf\.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\amanaf\.m2\repository\com\google\guava\guava\14.0.1\guava-14.0.1.jar;C:\Users\amanaf\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\amanaf\.m2\repository\org\apache\commons\commons-lang3\3.3.2\commons-lang3-3.3.2.jar;C:\Users\amanaf\.m2\repository\org\apache\commons\commons-math3\3.4.1\commons-math3-3.4.1.jar;C:\Users\amanaf\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\amanaf\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar;C:\Users\amanaf\.m2\repository\org\slf4j\jul-to-slf4j\1.7.10\jul-to-slf4j-1.7.10.jar;C:\Users\amanaf\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\amanaf\.m2\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;C:\Users\amanaf\.m2\repository\org\slf4j\slf4j-log4j12\1.7.10\slf4j-log4j12-1.7.10.jar;C:\Users\amanaf\.m2\repository\com\ning\compress-lzf\1.0.3\compress-lzf-1.0.3.jar;C:\Users\amanaf\.m2\repository\org\xerial\snappy\snappy-java\1.1.2\snappy-java-1.1.2.jar;C:\Users\amanaf\.m2\repository\net\jpountz\lz4\lz4\1.3.0\lz4-1.3.0.jar;C:\Users\amanaf\.m2\repository\org\roaringbitmap\RoaringBitmap\0.5.11\RoaringBitmap-0.5.11.jar;C:\Users\amanaf\.m2\repository\commons-net\commons-net\2.2\commons-net-2.2.jar;C:\Users\amanaf\.m2\repository\com\typesafe\akka\akka-remote_2.10\2.3.11\akka-remote_2.10-2.3.11.jar;C:\Users\amanaf\.m2\repository\com\typesafe\akka\akka-actor_2.10\2.3.11\akka-actor_2.10-2.3.11.jar;C:\Users\amanaf\.m2\repository\com\typesafe\config\1.2.1\config-1.2.1.jar;C:\Users\amanaf\.m2\repository\io\netty\netty\3.8.0.Final\netty-3.8.0.Final.jar;C:\Users\amanaf\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;C:\Users\amanaf\.m2\repository\org\uncommons\maths\uncommons-maths\1.2.2a\uncommons-maths-1.2.2a.jar;C:\Users\amanaf\.m2\repository\com\typesafe\akka\akka-slf4j_2.10\2.3.11\akka-slf4j_2.10-2.3.11.jar;C:\Users\amanaf\.m2\repository\org\scala-lang\scala-library\2.10.5\scala-library-2.10.5.jar;C:\Users\amanaf\.m2\repository\org\json4s\json4s-jackson_2.10\3.2.10\json4s-jackson_2.10-3.2.10.jar;C:\Users\amanaf\.m2\repository\org\json4s\json4s-core_2.10\3.2.10\json4s-core_2.10-3.2.10.jar;C:\Users\amanaf\.m2\repository\org\json4s\json4s-ast_2.10\3.2.10\json4s-ast_2.10-3.2.10.jar;C:\Users\amanaf\.m2\repository\org\scala-lang\scalap\2.10.0\scalap-2.10.0.jar;C:\Users\amanaf\.m2\repository\org\scala-lang\scala-compiler\2.10.0\scala-compiler-2.10.0.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;C:\Users\amanaf\.m2\repository\asm\asm\3.1\asm-3.1.jar;C:\Users\amanaf\.m2\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;C:\Users\amanaf\.m2\repository\org\apache\mesos\mesos\0.21.1\mesos-0.21.1-shaded-protobuf.jar;C:\Users\amanaf\.m2
\repository\io\netty\netty-all\4.0.29.Final\netty-all-4.0.29.Final.jar;C:\Users\amanaf\.m2\repository\com\clearspring\analytics\stream\2.7.0\stream-2.7.0.jar;C:\Users\amanaf\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.2\metrics-core-3.1.2.jar;C:\Users\amanaf\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.2\metrics-jvm-3.1.2.jar;C:\Users\amanaf\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.2\metrics-json-3.1.2.jar;C:\Users\amanaf\.m2\repository\io\dropwizard\metrics\metrics-graphite\3.1.2\metrics-graphite-3.1.2.jar;C:\Users\amanaf\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.4.4\jackson-databind-2.4.4.jar;C:\Users\amanaf\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.4.4\jackson-core-2.4.4.jar;C:\Users\amanaf\.m2\repository\com\fasterxml\jackson\module\jackson-module-scala_2.10\2.4.4\jackson-module-scala_2.10-2.4.4.jar;C:\Users\amanaf\.m2\repository\org\scala-lang\scala-reflect\2.10.4\scala-reflect-2.10.4.jar;C:\Users\amanaf\.m2\repository\com\thoughtworks\paranamer\paranamer\2.6\paranamer-2.6.jar;C:\Users\amanaf\.m2\repository\org\apache\ivy\ivy\2.4.0\ivy-2.4.0.jar;C:\Users\amanaf\.m2\repository\oro\oro\2.0.8\oro-2.0.8.jar;C:\Users\amanaf\.m2\repository\org\tachyonproject\tachyon-client\0.8.2\tachyon-client-0.8.2.jar;C:\Users\amanaf\.m2\repository\commons-lang\commons-lang\2.4\commons-lang-2.4.jar;C:\Users\amanaf\.m2\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;C:\Users\amanaf\.m2\repository\org\tachyonproject\tachyon-underfs-hdfs\0.8.2\tachyon-underfs-hdfs-0.8.2.jar;C:\Users\amanaf\.m2\repository\org\tachyonproject\tachyon-underfs-s3\0.8.2\tachyon-underfs-s3-0.8.2.jar;C:\Users\amanaf\.m2\repository\org\tachyonproject\tachyon-underfs-local\0.8.2\tachyon-underfs-local-0.8.2.jar;C:\Users\amanaf\.m2\repository\net\razorvine\pyrolite\4.9\pyrolite-4.9.jar;C:\Users\amanaf\.m2\repository\net\sf\py4j\py4j\0.9\py4j-0.9.jar;C:\Users\amanaf\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar SparkTest
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/11/29 06:02:59 INFO SparkContext: Running Spark version 1.6.1
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/Users/amanaf/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
18/11/29 06:02:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/11/29 06:02:59 INFO SecurityManager: Changing view acls to: amanaf
18/11/29 06:02:59 INFO SecurityManager: Changing modify acls to: amanaf
18/11/29 06:02:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(amanaf); users with modify permissions: Set(amanaf)
18/11/29 06:03:00 INFO PlatformDependent: Your platform does not provide complete low-level API for accessing direct buffers reliably. Unless explicitly requested, heap buffer will always be preferred to avoid potential system unstability.
18/11/29 06:03:00 INFO Utils: Successfully started service 'sparkDriver' on port 55702.
18/11/29 06:03:00 INFO Slf4jLogger: Slf4jLogger started
18/11/29 06:03:00 INFO Remoting: Starting remoting
18/11/29 06:03:00 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.20.255.74:55715]
18/11/29 06:03:00 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 55715.
18/11/29 06:03:00 INFO SparkEnv: Registering MapOutputTracker
18/11/29 06:03:00 INFO SparkEnv: Registering BlockManagerMaster
18/11/29 06:03:00 INFO DiskBlockManager: Created local directory at C:\Users\amanaf\AppData\Local\Temp\blockmgr-183dfab1-dc04-401d-9b91-6caf7861709d
18/11/29 06:03:00 INFO MemoryStore: MemoryStore started with capacity 2.8 GB
18/11/29 06:03:00 INFO SparkEnv: Registering OutputCommitCoordinator
18/11/29 06:03:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/11/29 06:03:00 INFO SparkUI: Started SparkUI at http://172.20.255.74:4040
18/11/29 06:03:01 INFO Executor: Starting executor ID driver on host localhost
18/11/29 06:03:01 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55752.
18/11/29 06:03:01 INFO NettyBlockTransferService: Server created on 55752
18/11/29 06:03:01 INFO BlockManagerMaster: Trying to register BlockManager
18/11/29 06:03:01 INFO BlockManagerMasterEndpoint: Registering block manager localhost:55752 with 2.8 GB RAM, BlockManagerId(driver, localhost, 55752)
18/11/29 06:03:01 INFO BlockManagerMaster: Registered BlockManager
18/11/29 06:03:01 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 73.9 KB, free 73.9 KB)
18/11/29 06:03:01 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 9.8 KB, free 83.7 KB)
18/11/29 06:03:01 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:55752 (size: 9.8 KB, free: 2.8 GB)
18/11/29 06:03:01 INFO SparkContext: Created broadcast 0 from textFile at SparkTest.java:12
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:362)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1015)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1015)
    at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
    at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:176)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:195)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1157)
    at org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:440)
    at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:46)
    at SparkTest.main(SparkTest.java:13)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3319)
    at java.base/java.lang.String.substring(String.java:1874)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:48)
    ... 23 more
18/11/29 06:03:01 INFO SparkContext: Invoking stop() from shutdown hook
18/11/29 06:03:01 INFO SparkUI: Stopped Spark web UI at http://172.20.255.74:4040
18/11/29 06:03:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/11/29 06:03:01 INFO MemoryStore: MemoryStore cleared
18/11/29 06:03:01 INFO BlockManager: BlockManager stopped
18/11/29 06:03:01 INFO BlockManagerMaster: BlockManagerMaster stopped
18/11/29 06:03:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/11/29 06:03:01 INFO SparkContext: Successfully stopped SparkContext
18/11/29 06:03:01 INFO ShutdownHookManager: Shutdown hook called
18/11/29 06:03:01 INFO ShutdownHookManager: Deleting directory C:\Users\amanaf\AppData\Local\Temp\spark-38128353-d1ea-4f8e-9edb-62b97a6fa4b5
18/11/29 06:03:01 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.

Process finished with exit code 1

Am I missing something in my code?


Answer

The actual error in your log is:

Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2

This is caused by a bug in the hadoop-common library when running on Java 9 and later. For details on the bug, see https://issues.apache.org/jira/browse/HADOOP-14586.
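
The stack trace points at the static initializer of Hadoop's Shell class, which parses the java.version system property assuming the pre-Java-9 "1.x" numbering scheme. A minimal sketch of that kind of parsing (simplified for illustration, not Hadoop's exact source) shows why it breaks:

public class JavaVersionParseDemo {
    public static void main(String[] args) {
        // On Java 8 the property is e.g. "1.8.0_181", so substring(0, 3) yields "1.8".
        String version = System.getProperty("java.version");
        // On Java 9+ it is a short string such as "11" (length 2), so asking for
        // characters 0..3 throws exactly the exception from the log:
        // StringIndexOutOfBoundsException: begin 0, end 3, length 2
        String prefix = version.substring(0, 3);
        System.out.println("Parsed version prefix: " + prefix);
    }
}

Running this class on JDK 11 reproduces the same exception that Hadoop's class initialization triggers in your log.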

It will also be addressed on the Spark side in the Spark 3.0.0 release; see https://issues.apache.org/jira/browse/SPARK-26134. For now, the workaround is to downgrade from Java 11 to Java 8.
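
Note that the first line of your log shows the program being launched with "C:\Program Files\Java\jdk-11\bin\java.exe", so in IntelliJ the practical fix is to point the Project SDK at a JDK 8 installation. If you build with Maven, you can additionally make the build fail fast when the wrong JDK is in use; a minimal sketch using the maven-enforcer-plugin (the plugin version shown here is illustrative):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.0.0-M2</version>
    <executions>
        <execution>
            <id>enforce-java-8</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <!-- Require building with a JDK in the [1.8, 9) range -->
                    <requireJavaVersion>
                        <version>[1.8,9)</version>
                    </requireJavaVersion>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>

This only guards the Maven build itself; the IntelliJ run configuration still uses whatever Project SDK is selected in the IDE.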


