Question:

Spark 2.1.0: reading a compressed csv file

勾喜
2023-03-14

I am trying to read a compressed csv file (.bz2) into a DataFrame. My code is as follows

// read the data
    Dataset<Row> rData = spark.read().option("header", true).csv(input);

This works when I run it from my IDE: I can read the data and process it. But when I build the project with Maven and run it from the command line, I get the following error

    Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: csv. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:569)
    at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
    at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
    at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:415)
    at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:352)
    at com.cs6240.Driver.main(Driver.java:28)
Caused by: java.lang.ClassNotFoundException: csv.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:554)
    ... 7 more

I am not sure whether I am missing something here. Is there some dependency needed to read csv files? According to the documentation, Spark 2.x.x has built-in support for this.
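For reference, the built-in csv source ships with the spark-sql module, so the project's pom.xml would normally include something like the sketch below (assuming Spark 2.1.0 built against Scala 2.11; adjust the version and Scala suffix to match your build):

    <!-- spark-sql provides the built-in csv data source in Spark 2.x -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>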

1 answer in total

段干靖
2023-03-14

I solved the problem by following the steps in this answer: https://stackoverflow.com/a/39465892/2705924

Basically, there was an issue with the assembly plugin; the error went away once I switched to the shade plugin and added this transformer

<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
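For context, here is a minimal sketch of how that transformer might sit inside a maven-shade-plugin configuration (the plugin version and execution details are assumptions, not taken from the linked answer). The ServicesResourceTransformer merges the META-INF/services files from all dependencies, so Spark's DataSourceRegister entries, including the one for the csv source, survive the shading step instead of being overwritten:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.0.0</version>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <transformers>
                        <!-- Merge META-INF/services registration files so the
                             csv data source can still be looked up in the fat jar -->
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    </transformers>
                </configuration>
            </execution>
        </executions>
    </plugin>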