assemblyMergeStrategy causing scala.MatchError when compiling
I'm new to sbt/assembly. I'm trying to resolve some dependency conflicts, and it seems the only way to do that is with a custom merge strategy. However, whenever I add a merge strategy, I get a seemingly random MatchError at assembly time:
[error] (*:assembly) scala.MatchError: org/apache/spark/streaming/kafka/KafkaUtilsPythonHelper$$anonfun.class (of class java.lang.String)
The MatchError shown is for the kafka library, but if I remove that library entirely, I get a MatchError on a different library. If I take out all the libraries, I get a MatchError on my own code. None of this happens if I take out the assemblyMergeStrategy block. I'm clearly missing something very basic, but for the life of me I can't find it, and I can't find anyone else with this problem. I've tried the older mergeStrategy syntax, but from what I've read in the docs and on SO, this is the correct way to write it now. Help?
Here is my project/assembly.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
And my project.sbt file:
name := "Clerk"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided",
  "org.apache.kafka" %% "kafka" % "0.8.2.1",
  "ch.qos.logback" % "logback-classic" % "1.1.7",
  "net.logstash.logback" % "logstash-logback-encoder" % "4.6",
  "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1",
  ("org.apache.spark" %% "spark-streaming-kafka" % "1.6.1").
    exclude("org.spark-project.spark", "unused")
)
assemblyMergeStrategy in assembly := {
  case PathList("org.slf4j", "impl", xs @ _*) => MergeStrategy.first
}
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
You are missing the default case in your merge strategy pattern match:
assemblyMergeStrategy in assembly := {
  case PathList("org.slf4j", "impl", xs @ _*) => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}