Why does sbt assembly of a Spark application lead to "Modules were resolved with conflicting cross-version suffixes"?

I am using a CDH cluster with Spark 2.1 and Scala 2.11.8.

I am using sbt 1.0.2.

When I run assembly, I get the following error:

[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators

I tried using dependencyOverrides and force() to resolve the version mismatch, but neither worked.
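
For reference, such attempts typically look like this in build.sbt (a sketch with placeholder versions, not necessarily the exact lines I used):

// pin the _2.11 variants explicitly (placeholder versions)
dependencyOverrides += "org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4"
dependencyOverrides += "org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4"

// or force a single version on the dependency itself
libraryDependencies += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4").force()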

Full error message from sbt assembly:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
[error]    org.scala-lang.modules:scala-xml _2.11, _2.12
[error]    org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error]         at scala.sys.package$.error(package.scala:27)
[error]         at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error]         at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error]         at sbt.Classpaths$.$anonfun$ivyBaseSettings(Defaults.scala:1971)
[error]         at scala.Function1.$anonfun$compose(Function1.scala:44)
[error]         at sbt.internal.util.$tilde$greater.$anonfun$$u2219(TypeFunctions.scala:42)
[error]         at sbt.std.Transform$$anon.work(System.scala:64)
[error]         at sbt.Execute.$anonfun$submit(Execute.scala:257)
[error]         at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error]         at sbt.Execute.work(Execute.scala:266)
[error]         at sbt.Execute.$anonfun$submit(Execute.scala:257)
[error]         at sbt.ConcurrentRestrictions$$anon.$anonfun$submitValid(ConcurrentRestrictions.scala:167)
[error]         at sbt.CompletionService$$anon.call(CompletionService.scala:32)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error]         at java.lang.Thread.run(Thread.java:748)
[error] (*:update) Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] Total time: 413 s, completed Oct 12, 2017 3:28:02 AM

build.sbt

name := "newtest"
version := "0.0.2"

scalaVersion := "2.11.8" 

sbtPlugin := true

val sparkVersion = "2.1.0"

mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")

assemblyJarName in assembly := "newtest.jar"


libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0" % "provided",
  "com.databricks" % "spark-avro_2.11" % "3.2.0",
  "org.apache.spark" % "spark-hive_2.11" % "2.1.0" % "provided")


libraryDependencies +=
     "log4j" % "log4j" % "1.2.15" excludeAll(
       ExclusionRule(organization = "com.sun.jdmk"),
       ExclusionRule(organization = "com.sun.jmx"),
       ExclusionRule(organization = "javax.jms")
     )

resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

plugins.sbt

dependencyOverrides += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4")
dependencyOverrides += ("org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

tl;dr Remove sbtPlugin := true from build.sbt (it is meant for sbt plugins, not applications). With that flag set, sbt adds itself as a library dependency, and sbt 1.x is built with Scala 2.12, so its _2.12 variants of scala-xml and scala-parser-combinators collide with the _2.11 variants pulled in by Spark, which is exactly the conflicting cross-version suffix error above.

You should also remove the dependencyOverrides lines from plugins.sbt; that file configures the build itself (the meta project), so overrides there do not affect your application's dependencies.
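
After that cleanup, plugins.sbt only needs the plugin declaration and its resolver (a minimal sketch based on the original file):

// project/plugins.sbt: keep only the assembly plugin
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")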

You should also change spark-core_2.11 and the other Spark dependencies in libraryDependencies as follows:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

The change is to use %% (two percent signs) and drop the Scala version suffix from the artifact name, e.g. spark-core above; sbt then appends the project's Scala binary version (_2.11) itself.
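
Putting it all together, a corrected build.sbt would look roughly like this (a sketch that only applies the changes above; names, versions, and settings are kept from the original, the previously unused sparkVersion val is now actually used, and the bintray-sbt-plugins resolver is omitted since it belongs in plugins.sbt):

name := "newtest"
version := "0.0.2"

scalaVersion := "2.11.8"

val sparkVersion = "2.1.0"

mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")

assemblyJarName in assembly := "newtest.jar"

// %% lets sbt append the Scala binary version (_2.11) automatically
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "com.databricks" %% "spark-avro" % "3.2.0")

libraryDependencies +=
  "log4j" % "log4j" % "1.2.15" excludeAll(
    ExclusionRule(organization = "com.sun.jdmk"),
    ExclusionRule(organization = "com.sun.jmx"),
    ExclusionRule(organization = "javax.jms")
  )

resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}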