Why does spark-xml fail with NoSuchMethodError with Spark 2.0.0 dependency?

Hi, I'm new to Scala and IntelliJ, and I'm just trying to do this in Scala:

import org.apache.spark
import org.apache.spark.sql.SQLContext
import com.databricks.spark.xml.XmlReader

object SparkSample {
  def main(args: Array[String]): Unit = {
    val conf = new spark.SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new spark.SparkContext(conf)

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "shop")
      .load("shops.xml") /* NoSuchMethod error here */

    val selectedData = df.select("author", "_id")
    selectedData.show
  }
}

Basically I'm trying to convert XML into a Spark DataFrame. I get the NoSuchMethod error at '.load("shops.xml")'. Below is my build.sbt:

version := "0.1"

scalaVersion := "2.11.3"
val sparkVersion = "2.0.0"
val sparkXMLVersion = "0.3.3"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion exclude("jline", "2.12"),
  "org.apache.spark" %% "spark-sql"  % sparkVersion excludeAll(ExclusionRule(organization = "jline"), ExclusionRule("name", "2.12")),
  "com.databricks"   %% "spark-xml"  % sparkXMLVersion
)

Below is the stack trace:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.types.DecimalType$.Unlimited()Lorg/apache/spark/sql/types/DecimalType;
  at com.databricks.spark.xml.util.InferSchema$.<init>(InferSchema.scala:50)
  at com.databricks.spark.xml.util.InferSchema$.<clinit>(InferSchema.scala)
  at com.databricks.spark.xml.XmlRelation$$anonfun.apply(XmlRelation.scala:46)
  at com.databricks.spark.xml.XmlRelation$$anonfun.apply(XmlRelation.scala:46)
  at scala.Option.getOrElse(Option.scala:120)
  at com.databricks.spark.xml.XmlRelation.<init>(XmlRelation.scala:45)
  at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:66)
  at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:44)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)

Can someone point out the error? It looks like a dependency problem to me. spark-core seems to work fine, but spark-sql doesn't. I used Scala 2.12 earlier and switched to 2.11 because spark-core wouldn't resolve.

tl;dr I think it's a Spark version mismatch. Use spark-xml 0.4.1.

Quoting spark-xml's Requirements (highlighting mine):

This library requires Spark 2.0+ for 0.4.x.

For version that works with Spark 1.x, please check for branch-0.3.

That tells me that spark-xml 0.3.3 works with Spark 1.x, not the Spark 2.0.0 you requested.
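In other words, spark-xml 0.3.3 was compiled against Spark 1.x's spark-sql, whose DecimalType$.Unlimited() no longer exists in Spark 2.0 (that's exactly the method named in your stack trace), hence the NoSuchMethodError at runtime. A minimal sketch of a corrected build.sbt, assuming you stay on Spark 2.0.0 (I've dropped the jline excludes, which shouldn't be needed here, and bumped Scala to 2.11.8, which I believe is what Spark 2.0.0 was built with):

version := "0.1"

scalaVersion := "2.11.8"
val sparkVersion = "2.0.0"
val sparkXMLVersion = "0.4.1" // the 0.4.x line requires Spark 2.0+

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  "com.databricks"   %% "spark-xml"  % sparkXMLVersion
)

As a side note, with Spark 2.0 you can also read the XML through a SparkSession instead of the old SQLContext; a sketch under the same assumptions (shops.xml and the rowTag are taken from your code):

import org.apache.spark.sql.SparkSession

object SparkSample {
  def main(args: Array[String]): Unit = {
    // SparkSession is the Spark 2.x entry point, replacing SQLContext
    val spark = SparkSession.builder()
      .appName("Datasets Test")
      .master("local[2]")
      .getOrCreate()

    val df = spark.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "shop")
      .load("shops.xml")

    df.select("author", "_id").show()

    spark.stop()
  }
}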