NoClassDefFoundError: org/apache/spark/sql/DataFrame in spark-cassandra-connector

I am trying to upgrade spark-cassandra-connector from 1.4 to 1.5.

Everything seemed fine, but when I run my test cases, the job hangs partway through and logs this error message:

Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame

My pom file looks like this:

<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <!-- Spark Cassandra Connector -->
    <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>16.0.1</version>
    </dependency>
    <!-- Scala Library -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.5</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.cassandra</groupId>
        <artifactId>cassandra-driver-core</artifactId>
        <version>3.0.2</version>
    </dependency>
    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
            <exclusion>
                <groupId>net.java.dev.jets3t</groupId>
                <artifactId>jets3t</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
</project>

Thanks in advance!!

Can anyone help me with this? Let me know if you need more information!!

Try adding the spark-sql dependency; the DataFrame class lives in the spark-sql module, which your pom does not include:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
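
Note that ${spark.version} only resolves if the property is actually declared. If your pom does not already define it, here is a minimal sketch of a <properties> block, assuming Spark 1.5.0 to match the spark-core_2.10 dependency in your pom:

<properties>
    <!-- Assumed value: keep this in sync with every Spark artifact in the pom -->
    <spark.version>1.5.0</spark.version>
</properties>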

Also make sure that your spark-cassandra-connector version is compatible with the Spark version you are using (connector 1.5 targets Spark 1.5). When I tried to use an older spark-cassandra-connector with a newer Spark version, I got this same error even with all the correct dependencies in place. Refer to this table: https://github.com/datastax/spark-cassandra-connector#version-compatibility
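
To confirm which Spark artifacts actually land on your classpath (and to spot a second Spark version pulled in transitively by the connector), Maven's dependency plugin helps:

mvn dependency:tree -Dincludes=org.apache.spark

If spark-sql_2.10 is missing from the output, or two different Spark versions show up in it, that is a likely cause of the NoClassDefFoundError.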