sbt-assembly not including dependencies

I'm trying to build a fat jar with sbt-assembly to pass to spark-submit. However, I can't seem to get the build set up correctly.

My current build.sbt is as follows:

name := "MyAppName"

version := "1.0"

scalaVersion := "2.10.6"


libraryDependencies  ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  "org.scalanlp" %% "breeze" % "0.12",
  "org.scalanlp" %% "breeze-natives" % "0.12"
)

resolvers ++= Seq(
  "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
)
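
For completeness, sbt-assembly itself is enabled via project/plugins.sbt with something along these lines (the plugin version below is a guess on my part, not copied from my actual setup):

// project/plugins.sbt — registers the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")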

Running sbt assembly produces a jar. However, after submitting the jar with spark-submit MyAppName-assembly-1.0.jar (the main class is already specified, so I assume it's fine that I didn't pass a class explicitly), the following exception is thrown:

java.lang.NoSuchMethodError: breeze.linalg.DenseVector.noOffsetOrStride()Z
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:629)
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:626)
at breeze.linalg.ImmutableNumericOps$class.dot(NumericOps.scala:98)
at breeze.linalg.DenseVector.dot(DenseVector.scala:50)
at RunMe$.cosSimilarity(RunMe.scala:103)
at RunMe$$anonfun.apply(RunMe.scala:35)
at RunMe$$anonfun.apply(RunMe.scala:33)
at scala.collection.Iterator$$anon.next(Iterator.scala:328)
at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:30)
at org.spark-project.guava.collect.Ordering.leastOf(Ordering.java:658)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$$anonfun.apply(RDD.scala:1377)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$$anonfun.apply(RDD.scala:1374)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$$anonfun$apply.apply(RDD.scala:710)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$$anonfun$apply.apply(RDD.scala:710)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

I'm still fairly new to the world of Scala and sbt, so any help is greatly appreciated!

I ran into a similar problem. I ended up keeping the jar under the lib directory and then adding this to assembly.sbt:

unmanagedJars in Compile += file("lib/my.jar")

It turns out the problem was that Breeze is already bundled with Spark. The Breeze version Spark 1.6 ships is older than the 0.12 I declared, and Spark's copy takes precedence on the executor classpath, so it was missing a method my code had been compiled against.
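
For anyone hitting the same conflict, here are two ways I'd sketch the fix in build.sbt. The version number is an assumption (check which Breeze your Spark distribution actually bundles), and the shading rule assumes sbt-assembly 0.14+:

// Option 1: compile against the same Breeze that Spark 1.6.x already bundles,
// so the classes loaded at runtime match what the code was compiled against
// (0.11.2 is an assumption — verify it against Spark's dependency tree)
libraryDependencies ++= Seq(
  "org.scalanlp" %% "breeze" % "0.11.2",
  "org.scalanlp" %% "breeze-natives" % "0.11.2"
)

// Option 2: keep Breeze 0.12 but shade it inside the fat jar so it cannot
// clash with the copy on Spark's classpath (needs sbt-assembly 0.14+)
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("breeze.**" -> "shadedbreeze.@1").inAll
)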

My reference: Apache Spark - java.lang.NoSuchMethodError: breeze.linalg.DenseVector