Runtime error on Scala Spark 2.0 code

I have the following code:

import org.apache.spark.sql.SparkSession
        .
        .
        .
    val spark = SparkSession
      .builder()
      .appName("PTAMachineLearner")
      .getOrCreate()

When I execute it, I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at org.apache.spark.sql.SparkSession$Builder.config(SparkSession.scala:750)
    at org.apache.spark.sql.SparkSession$Builder.appName(SparkSession.scala:741)
    at com.acme.pta.accuracy.ml.PTAMachineLearnerModel.getDF(PTAMachineLearnerModel.scala:52)

The code compiles and builds fine. Here are the dependencies:

scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
  // Spark dependencies
  "org.apache.spark" %% "spark-hive" % "2.1.1",
  "org.apache.spark" %% "spark-mllib" % "2.1.1",
  // Third-party libraries
  "net.sf.jopt-simple" % "jopt-simple" % "5.0.3",
  "com.amazonaws" % "aws-java-sdk" % "1.3.11",
  "org.apache.logging.log4j" % "log4j-api" % "2.8.2",
  "org.apache.logging.log4j" % "log4j-core" % "2.8.2",
  "org.apache.logging.log4j" %% "log4j-api-scala" % "2.8.2",
  "com.typesafe.play" %% "play-ahc-ws-standalone" % "1.0.0-M9",
  "net.liftweb" % "lift-json_2.11" % "3.0.1"
)

I am running the code like this:

/Users/paulreiners/spark-2.1.1-bin-hadoop2.7/bin/spark-submit \
      --class "com.acme.pta.accuracy.ml.CreateRandomForestRegressionModel" \
      --master local[4] \
      target/scala-2.11/acme-pta-accuracy-ocean.jar

I had all of this running with Spark 1.6. I am trying to upgrade to Spark 2, but something is missing.

The class ArrowAssoc does exist in your Scala library (see this Scala doc). But you are getting the error inside the Spark library. So obviously, the Spark version you are using is not compatible with Scala 2.11, as it was probably compiled against an older Scala version. If you look at the older Scala API doc, ArrowAssoc has changed a lot. For example, it is now implicit and carries implicit dependencies. Make sure your Spark and Scala versions are compatible.
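One way to reduce this kind of drift is to pin the Spark version in a single place in build.sbt and mark the Spark artifacts as "provided", so the cluster's own Spark (and its matching Scala) is used at runtime. This is a hypothetical fragment, not the asker's actual build file:

```scala
// Hypothetical build.sbt sketch: one sparkVersion val keeps all Spark
// modules on the same release, and "provided" keeps spark-submit's own
// Spark/Scala jars from clashing with the ones bundled in your jar.
val sparkVersion = "2.1.1"

scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-hive"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
)
```

The `%%` operator appends the Scala binary suffix (`_2.11`) automatically, so the dependency can only resolve to artifacts built for the `scalaVersion` declared above.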

I found the problem. I had Scala 2.10.5 installed on my system, so sbt or spark-submit was invoking that when it expected 2.11.11.
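To confirm which Scala library actually ends up on the runtime classpath (as opposed to the one sbt compiled against), you can print the runtime version. This is a small diagnostic sketch, not part of the original code:

```scala
// Diagnostic sketch: report the Scala version of the scala-library jar
// that was actually loaded at runtime. If this prints 2.10.x while the
// build declared 2.11.11, the wrong scala-library is on the classpath.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(s"Runtime Scala version: ${scala.util.Properties.versionNumberString}")
  }
}
```

Running this through the same spark-submit command as the real job shows the version the job will see, which is what matters for the NoSuchMethodError.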

I ran into the same problem. In my case, the issue was that I deployed the jar to a Spark 1.x cluster, while the code was written for Spark 2.x.

So if you see this error, just check the Spark and Scala versions used in your code against the versions that are actually installed.