Question:

Installing the Spark Cassandra connector

萧鸿轩
2023-03-14
https://github.com/datastax/spark-cassandra-connector
http://spark-packages.org/package/datastax/spark-cassandra-connector
[idf@node1 bin]$ spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.11
Ivy Default Cache set to: /home/idf/.ivy2/cache
The jars for the packages stored in: /home/idf/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;1.6.0-M1-s_2.11 in spark-packages
        found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
        found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
        found io.netty#netty-handler;4.0.33.Final in central
        found io.netty#netty-buffer;4.0.33.Final in central
        found io.netty#netty-common;4.0.33.Final in central
        found io.netty#netty-transport;4.0.33.Final in central
        found io.netty#netty-codec;4.0.33.Final in central
        found io.dropwizard.metrics#metrics-core;3.1.2 in list
        found org.slf4j#slf4j-api;1.7.7 in central
        found org.apache.commons#commons-lang3;3.3.2 in list
        found com.google.guava#guava;16.0.1 in central
        found org.joda#joda-convert;1.2 in central
        found joda-time#joda-time;2.3 in central
        found com.twitter#jsr166e;1.1.0 in central
        found org.scala-lang#scala-reflect;2.11.7 in list
        [2.11.7] org.scala-lang#scala-reflect;2.11.7
downloading http://dl.bintray.com/spark-packages/maven/datastax/spark-cassandra-connector/1.6.0-M1-s_2.11/spark-cassandra-connector-1.6.0-M1-s_2.11.jar ...
        [SUCCESSFUL ] datastax#spark-cassandra-connector;1.6.0-M1-s_2.11!spark-cassandra-connector.jar (2430ms)
downloading https://repo1.maven.org/maven2/org/apache/cassandra/cassandra-clientutil/3.0.2/cassandra-clientutil-3.0.2.jar ...
        [SUCCESSFUL ] org.apache.cassandra#cassandra-clientutil;3.0.2!cassandra-clientutil.jar (195ms)
downloading https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/3.0.0/cassandra-driver-core-3.0.0.jar ...
        [SUCCESSFUL ] com.datastax.cassandra#cassandra-driver-core;3.0.0!cassandra-driver-core.jar(bundle) (874ms)
downloading https://repo1.maven.org/maven2/com/google/guava/guava/16.0.1/guava-16.0.1.jar ...
        [SUCCESSFUL ] com.google.guava#guava;16.0.1!guava.jar(bundle) (1930ms)
downloading https://repo1.maven.org/maven2/org/joda/joda-convert/1.2/joda-convert-1.2.jar ...
        [SUCCESSFUL ] org.joda#joda-convert;1.2!joda-convert.jar (68ms)
downloading https://repo1.maven.org/maven2/joda-time/joda-time/2.3/joda-time-2.3.jar ...
        [SUCCESSFUL ] joda-time#joda-time;2.3!joda-time.jar (524ms)
downloading https://repo1.maven.org/maven2/com/twitter/jsr166e/1.1.0/jsr166e-1.1.0.jar ...
        [SUCCESSFUL ] com.twitter#jsr166e;1.1.0!jsr166e.jar (138ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-handler/4.0.33.Final/netty-handler-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-handler;4.0.33.Final!netty-handler.jar (266ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-buffer/4.0.33.Final/netty-buffer-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-buffer;4.0.33.Final!netty-buffer.jar (202ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-transport/4.0.33.Final/netty-transport-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-transport;4.0.33.Final!netty-transport.jar (330ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-codec/4.0.33.Final/netty-codec-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-codec;4.0.33.Final!netty-codec.jar (157ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-common/4.0.33.Final/netty-common-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-common;4.0.33.Final!netty-common.jar (409ms)
downloading https://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar ...
        [SUCCESSFUL ] org.slf4j#slf4j-api;1.7.7!slf4j-api.jar (57ms)
:: resolution report :: resolve 5827ms :: artifacts dl 7749ms
        :: modules in use:
        com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
        com.google.guava#guava;16.0.1 from central in [default]
        com.twitter#jsr166e;1.1.0 from central in [default]
        datastax#spark-cassandra-connector;1.6.0-M1-s_2.11 from spark-packages in [default]
        io.dropwizard.metrics#metrics-core;3.1.2 from list in [default]
        io.netty#netty-buffer;4.0.33.Final from central in [default]
        io.netty#netty-codec;4.0.33.Final from central in [default]
        io.netty#netty-common;4.0.33.Final from central in [default]
        io.netty#netty-handler;4.0.33.Final from central in [default]
        io.netty#netty-transport;4.0.33.Final from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
        org.apache.commons#commons-lang3;3.3.2 from list in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.scala-lang#scala-reflect;2.11.7 from list in [default]
        org.slf4j#slf4j-api;1.7.7 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   16  |   13  |   13  |   0   ||   16  |   13  |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        16 artifacts copied, 0 already retrieved (12730kB/549ms)
16/04/08 14:48:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
error: bad symbolic reference. A signature in package.class refers to type compileTimeOnly
in package scala.annotation which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling package.class.
<console>:14: error: Reference to value sc should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                @transient val sc = {
                               ^
<console>:15: error: Reference to method createSQLContext in class SparkILoop should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                  val _sqlContext = org.apache.spark.repl.Main.interp.createSQLContext()
                                                                      ^
<console>:14: error: Reference to value sqlContext should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                @transient val sqlContext = {
                               ^
<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^

scala>
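A minimal shell sketch (not part of the original log; variable names are illustrative): the "bad symbolic reference ... compileTimeOnly" errors above are the classic symptom of a Scala binary-version mismatch. The spark-shell banner reports Scala 2.10.5, while the `--packages` coordinate requested the `_2.11` build of the connector. Comparing the two suffixes makes the mismatch explicit:

```shell
# Scala version from the spark-shell banner, and the coordinate that was requested
spark_scala="2.10.5"
coordinate="datastax:spark-cassandra-connector:1.6.0-M1-s_2.11"

# Suffix of the package coordinate (text after the last underscore): 2.11
pkg_suffix="${coordinate##*_}"
# major.minor of the Spark build's Scala version: 2.10
spark_suffix="$(printf '%s' "$spark_scala" | cut -d. -f1,2)"

if [ "$pkg_suffix" = "$spark_suffix" ]; then
  echo "versions match"
else
  echo "mismatch: request the _${spark_suffix} build of the connector instead"
fi
```

With the values from the log above, this prints the mismatch branch, which is exactly what the "Edit 1" run below fixes by switching to the `_2.10` artifact.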

Edit 1

With the correct Scala version selected, it seems to get further, but I am not sure whether the output below still contains errors that need to be resolved:

[idf@node1 bin]$ spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
Ivy Default Cache set to: /home/idf/.ivy2/cache
The jars for the packages stored in: /home/idf/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;1.6.0-M1-s_2.10 in spark-packages
        found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
        found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
        found io.netty#netty-handler;4.0.33.Final in central
        found io.netty#netty-buffer;4.0.33.Final in central
        found io.netty#netty-common;4.0.33.Final in central
        found io.netty#netty-transport;4.0.33.Final in central
        found io.netty#netty-codec;4.0.33.Final in central
        found io.dropwizard.metrics#metrics-core;3.1.2 in list
        found org.slf4j#slf4j-api;1.7.7 in central
        found org.apache.commons#commons-lang3;3.3.2 in list
        found com.google.guava#guava;16.0.1 in central
        found org.joda#joda-convert;1.2 in central
        found joda-time#joda-time;2.3 in central
        found com.twitter#jsr166e;1.1.0 in central
        found org.scala-lang#scala-reflect;2.10.5 in list
downloading http://dl.bintray.com/spark-packages/maven/datastax/spark-cassandra-connector/1.6.0-M1-s_2.10/spark-cassandra-connector-1.6.0-M1-s_2.10.jar ...
        [SUCCESSFUL ] datastax#spark-cassandra-connector;1.6.0-M1-s_2.10!spark-cassandra-connector.jar (2414ms)
:: resolution report :: resolve 3281ms :: artifacts dl 2430ms
        :: modules in use:
        com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
        com.google.guava#guava;16.0.1 from central in [default]
        com.twitter#jsr166e;1.1.0 from central in [default]
        datastax#spark-cassandra-connector;1.6.0-M1-s_2.10 from spark-packages in [default]
        io.dropwizard.metrics#metrics-core;3.1.2 from list in [default]
        io.netty#netty-buffer;4.0.33.Final from central in [default]
        io.netty#netty-codec;4.0.33.Final from central in [default]
        io.netty#netty-common;4.0.33.Final from central in [default]
        io.netty#netty-handler;4.0.33.Final from central in [default]
        io.netty#netty-transport;4.0.33.Final from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
        org.apache.commons#commons-lang3;3.3.2 from list in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.scala-lang#scala-reflect;2.10.5 from list in [default]
        org.slf4j#slf4j-api;1.7.7 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   16  |   6   |   6   |   0   ||   16  |   1   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        2 artifacts copied, 14 already retrieved (5453kB/69ms)
16/04/08 15:50:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/08 15:50:45 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/08 15:50:45 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/08 15:51:09 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/08 15:51:09 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.

scala>

1 Answer

东门奕
2023-03-14

I ran into the same problem with the com.databricks:spark-redshift_2.11:2.0.1 package. My command was

pyspark --packages com.databricks:spark-redshift_2.11:2.0.1

I found that the most common cause of the "unknown resolver null" and "unknown resolver sbt-chain" errors is a mismatch between your Spark version, Scala version, and package version. So what you need to do is find a package version that matches your setup.

My package, spark-redshift, is published for both Scala versions:

groupId: com.databricks
artifactId: spark-redshift_2.10
version: 2.0.1
groupId: com.databricks
artifactId: spark-redshift_2.11
version: 2.0.1
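The selection rule above can be sketched in a few lines of shell (variable names are illustrative, not part of the answer): given the two published artifacts, pick the one whose Scala suffix matches the major.minor of the Scala version reported in the spark-shell banner (2.10.5 in the logs above).

```shell
# major.minor of the Scala version the Spark build reports
spark_scala="2.10"

# Pick whichever published artifact matches the Spark build's Scala version
selected=""
for artifact in spark-redshift_2.10 spark-redshift_2.11; do
  case "$artifact" in
    *_"$spark_scala") selected="com.databricks:${artifact}:2.0.1" ;;
  esac
done
echo "use: --packages $selected"
```

The same rule explains the question above: with a Scala 2.10 Spark build, the `_2.10` connector artifact works where the `_2.11` one fails.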