为什么 "java.lang.ClassNotFoundException: Failed to find data source: kinesis" 具有 spark-streaming-kinesis-asl 依赖性?

Why "java.lang.ClassNotFoundException: Failed to find data source: kinesis" with spark-streaming-kinesis-asl dependency?

My setup:

  scala:2.11.8
  spark:2.3.0.cloudera4

I have already added this to my .pom file:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kinesis-asl_2.11</artifactId>
  <version>2.3.0</version>
</dependency>

However, when I run my Spark streaming code that consumes data from Kinesis, it returns:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kinesis.

I ran into a similar error when consuming data from Kafka and solved it by specifying the dependency jars in the submit command. This time, however, that does not seem to work:

sudo -u hdfs spark2-submit --packages org.apache.spark:spark-streaming-kinesis-asl_2.11:2.3.0 --class com.package.newkinesis --master yarn  sparktest-1.0-SNAPSHOT.jar 

How can I solve this problem? Thanks for any help.

My code:

val spark = SparkSession
  .builder.master("local[4]")
  .appName("SpeedTester")
  .config("spark.driver.memory", "3g")
  .getOrCreate()

val kinesis = spark.readStream
  .format("kinesis")
  .option("streamName", kinesisStreamName)
  .option("endpointUrl", kinesisEndpointUrl)
  .option("initialPosition", "TRIM_HORIZON")
  .option("awsAccessKey", awsAccessKeyId)
  .option("awsSecretKey", awsSecretKey)
  .load()

kinesis.writeStream.format("console").start().awaitTermination()

My full .pom file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.netease</groupId>
  <artifactId>sparktest</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>
  <properties>
    <scala.version>2.11.8</scala.version>
  </properties>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <includes>
                                <include>org/apache/spark/*</include>
                            </includes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <scope>provided</scope>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <scope>provided</scope>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <scope>provided</scope>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>2.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kinesis-asl_2.11</artifactId>
      <version>2.3.0</version>
    </dependency>
  </dependencies>
</project>

tl;dr It won't work.

You are using spark-streaming-kinesis-asl_2.11, which is a dependency for the older Spark Streaming (DStream) API, while your code uses the newer Spark Structured Streaming API, hence the exception.
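
If you want to keep the spark-streaming-kinesis-asl_2.11 dependency, you have to consume Kinesis through the DStream-based KinesisInputDStream that this module actually provides, instead of spark.readStream. A minimal sketch, assuming the Spark 2.3 builder API (stream name, endpoint, region and batch interval are placeholders):

import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisInputDStream

object KinesisDStreamSketch {
  def main(args: Array[String]): Unit = {
    // Placeholders -- substitute your own stream, endpoint and region.
    val kinesisStreamName  = "my-stream"
    val kinesisEndpointUrl = "https://kinesis.us-east-1.amazonaws.com"
    val regionName         = "us-east-1"

    val conf = new SparkConf().setMaster("local[4]").setAppName("SpeedTester")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // DStream-based source shipped in spark-streaming-kinesis-asl. Credentials are
    // resolved through the default AWS credential provider chain here.
    val kinesisStream = KinesisInputDStream.builder
      .streamingContext(ssc)
      .streamName(kinesisStreamName)
      .endpointUrl(kinesisEndpointUrl)
      .regionName(regionName)
      .initialPositionInStream(InitialPositionInStream.TRIM_HORIZON)
      .checkpointAppName("SpeedTester")   // DynamoDB table the KCL uses for checkpoints
      .checkpointInterval(Seconds(10))
      .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
      .build()

    // Each record arrives as Array[Byte]; decode and print a few records per batch.
    kinesisStream.map(bytes => new String(bytes, "UTF-8")).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

With this approach, the --packages org.apache.spark:spark-streaming-kinesis-asl_2.11:2.3.0 flag you already pass is the right way to put the module on the classpath.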

Otherwise, to stay with Structured Streaming you have to find a compatible Spark Structured Streaming data source for AWS Kinesis, which is not officially supported by the Apache Spark project.
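
One option people commonly point to is Qubole's kinesis-sql connector (https://github.com/qubole/kinesis-sql), which registers the kinesis short name for Structured Streaming. The sketch below assumes that connector; the Maven coordinates in the comment and the exact option names are assumptions taken from that project and can differ between releases, so verify them against the documentation of whatever connector you choose:

// Submit with the third-party connector on the classpath, for example (coordinates are
// an assumption -- check the connector's README for the release matching Spark 2.3):
//   spark2-submit --packages com.qubole.spark:spark-sql-kinesis_2.11:<version> ...

import org.apache.spark.sql.SparkSession

object KinesisStructuredSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local[4]")
      .appName("SpeedTester")
      .getOrCreate()

    // Option names mirror the question; whether they are honoured, and how credentials
    // are supplied, depends entirely on the connector you pick.
    val kinesis = spark.readStream
      .format("kinesis")
      .option("streamName", "my-stream")                                 // placeholder
      .option("endpointUrl", "https://kinesis.us-east-1.amazonaws.com")  // placeholder
      .option("initialPosition", "TRIM_HORIZON")
      .load()

    kinesis.writeStream.format("console").start().awaitTermination()
  }
}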