Why would JavaNGramExample fail with "java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class"?

I am trying a simple NGram example in Spark:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaNGramExample.java

Here are my pom dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>

Below is the sample code:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.ml.feature.NGram;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class App {
    public static void main(String[] args) {
        System.out.println("Hello World!");

        // Point Hadoop at a local directory (Windows workaround for winutils.exe).
        // Note the escaped backslash: "D:\del" would not compile in Java.
        System.setProperty("hadoop.home.dir", "D:\\del");

        SparkSession spark = SparkSession
                .builder()
                .appName("JavaNGramExample")
                .config("spark.master", "local")
                .getOrCreate();

        List<Row> data = Arrays.asList(
                RowFactory.create(0, Arrays.asList("car", "killed", "cat")),
                RowFactory.create(1, Arrays.asList("train", "killed", "cat")),
                RowFactory.create(2, Arrays.asList("john", "plays", "cricket")),
                RowFactory.create(3, Arrays.asList("tom", "likes", "mangoes")));

        StructType schema = new StructType(new StructField[] {
                new StructField("id", DataTypes.IntegerType, false, Metadata.empty()),
                new StructField("words", DataTypes.createArrayType(DataTypes.StringType), false, Metadata.empty()) });

        Dataset<Row> wordDataFrame = spark.createDataFrame(data, schema);

        // Produce bigrams (n = 2) from the "words" column.
        NGram ngramTransformer = new NGram().setN(2).setInputCol("words").setOutputCol("ngrams");

        Dataset<Row> ngramDataFrame = ngramTransformer.transform(wordDataFrame);
        System.out.println(" DISPLAY NGRAMS ");
        ngramDataFrame.select("ngrams").show(false);
    }
}

I get the following error when I run this code:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at org.apache.spark.sql.types.StructType.<init>(StructType.scala:98)
    at com.mypackage.spark.learnspark.App.main(App.java:61)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

I checked, and the Scala dependency on my classpath is scala-library-2.11.8.

Is there some inconsistency between Spark 2.2.0 and my Scala jar?

tl;dr Change spark-mllib_2.10 to spark-mllib_2.11 so that Scala 2.11.8 is used for the Spark MLlib dependency (and optionally remove the spark-core_2.11 dependency, since spark-mllib already pulls in spark-core transitively).


Take a look at your pom.xml:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>
  1. spark-core_2.11 from Spark 2.2.0 depends on Scala 2.11.8. That's fine.

  2. spark-mllib_2.10 from Spark 2.2.0 depends on Scala 2.10.x.

In other words, the project mixes two different and incompatible Scala versions, 2.10.x and 2.11.8. That is the root cause of the issue: the _2.10/_2.11 suffix names the Scala binary version an artifact was compiled against, and Scala is not binary compatible between 2.10 and 2.11, so the 2.10-compiled StructType in your stack trace cannot find the classes it expects in the scala-library-2.11.8 jar that ends up on the classpath.
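You can confirm which Scala runtime Maven actually resolves, and where it comes from, with the standard dependency tree goal filtered to scala-library; on maven-dependency-plugin versions that support -Dverbose, it also prints the conflicting copy Maven would otherwise silently omit:

mvn dependency:tree -Dverbose -Dincludes=org.scala-lang:scala-library

With the pom above, this should show scala-library requested at both 2.10.x (via spark-mllib_2.10) and 2.11.8 (via spark-core_2.11).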

Make sure you use:

  1. The same suffix in the artifactId of all your Spark dependencies, i.e. spark-core_2.11 and spark-mllib_2.11 (note I changed it to 2.11).

  2. The same version in every Spark dependency.
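
Applied to the pom above, the fixed dependency section could look like the sketch below; the properties block is not required, it is just one common way to keep the Scala suffix and the Spark version aligned across artifacts:

<properties>
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.2.0</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

And since spark-mllib_2.11 depends on spark-core_2.11 transitively, the spark-core entry can be dropped entirely.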