pom.xml dependencies for Spark when using Scala 2.12.10
These Apache Spark dependencies do not work when using Scala 2.12.10:
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.12.10</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.0.1</version>
</dependency>
</dependencies>
Error when running the Spark application from IntelliJ:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:784)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:605)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:94)
    at org.apache.spark.SparkConf.set(SparkConf.scala:83)
    at org.apache.spark.SparkConf.setMaster(SparkConf.scala:115)
    at org.apache.spark.SparkContext$.updatedConf(SparkContext.scala:2717)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:153)
However, this set of dependencies works fine with the same Spark application:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.1</version>
</dependency>
Code snippet:
import org.apache.spark.SparkContext
import org.apache.log4j.{Level, Logger}

object Testing1 {
  def main(args: Array[String]): Unit = {
    // Silence Spark's verbose logging
    Logger.getLogger("org").setLevel(Level.OFF)

    val sc = new SparkContext("local[*]", "SparkDemo")

    // Classic word count: split lines into words, count each word,
    // then sort by count in descending order and take the top 10
    val lines = sc.textFile("sample.txt")
    val words = lines.flatMap(line => line.split(' '))
    val wordsKVRdd = words.map(x => (x, 1))
    val count = wordsKVRdd
      .reduceByKey((x, y) => x + y)
      .map(x => (x._2, x._1))
      .sortByKey(ascending = false)
      .map(x => (x._2, x._1))
      .take(10)
    count.foreach(println)
  }
}
This error indicates a Scala version mismatch. Either another of your dependencies pulls in Scala 2.11, or you simply need to run mvn clean to get rid of stale classes compiled against Scala 2.11. Also check the Scala version configured in the project settings.
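If you suspect a transitive dependency is dragging in Scala 2.11, the dependency tree makes it easy to spot. This is a generic Maven invocation, not something specific to this project:

mvn dependency:tree -Dincludes=org.scala-lang:scala-library

Any artifact in that output whose name ends in _2.11 would need to be replaced with its _2.12 build.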
It started working after I added the Scala 2.12.10 SDK in IntelliJ's module settings. I also removed the Scala 2.11.8 SDK from the module/project settings.
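To keep the pom and the IDE from drifting apart again, a common pattern is to declare the versions once as properties and reference them everywhere. A minimal sketch (the property names here are arbitrary):

<properties>
    <scala.version>2.12.10</scala.version>
    <scala.binary.version>2.12</scala.binary.version>
    <spark.version>3.0.1</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

That way a future Scala upgrade is a one-line change, and IntelliJ picks up the same version the build uses when it re-imports the Maven project.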