NoClassDefFoundError while using scopt OptionParser with Spark
I am using Apache Spark 1.2.1 with Scala 2.10.4, and I am trying to get the MovieLensALS example working. However, I am running into errors with the scopt library, which the code requires. Any help would be appreciated.
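For reference, the example drives its command-line parsing through scopt along these lines (a minimal sketch of the scopt 3.x OptionParser API; the Params fields here are illustrative, not the exact options of MovieLensALS):

object MovieLensALSArgs {
  // illustrative configuration case class; the real example has more fields
  case class Params(input: String = "", rank: Int = 10)

  def main(args: Array[String]): Unit = {
    val parser = new scopt.OptionParser[Params]("MovieLensALS") {
      head("MovieLensALS: an ALS example on MovieLens data")
      opt[Int]("rank")
        .text("rank of the matrix factorization")
        .action((x, c) => c.copy(rank = x))
      arg[String]("<input>")
        .text("path to the MovieLens ratings file")
        .action((x, c) => c.copy(input = x))
    }
    parser.parse(args, Params()) match {
      case Some(params) => println(s"parsed: rank=${params.rank}, input=${params.input}")
      case None         => sys.exit(1) // scopt has already printed the usage message
    }
  }
}

Because scopt.OptionParser is referenced directly in main, the class must be on the driver's classpath at launch, which is exactly what fails below.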
My build.sbt is as follows:
name := "Movie Recommender System"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.2.1"
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"
resolvers += Resolver.sonatypeRepo("public")
The error I get is as follows:
Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser
at MovieLensALS.main(MovieLensALS.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scopt.OptionParser
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
... 8 more
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
When running sbt assembly to build the jar, I get the following error:
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error] ^
EDIT: Following Justin Piphony's suggestion, the solution listed on the sbt GitHub page helped fix this error. Basically, create a file assembly.sbt in the project/ directory and add the line
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
Note that the plugin version should be chosen according to the sbt version in use.
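With the plugin in place, running sbt assembly builds a fat jar (by default under target/scala-2.10/, named after the project's name and version with an -assembly suffix), and that jar, rather than the one produced by sbt package, is what should be passed to spark-submit.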
You need to package scopt into your jar; sbt doesn't do this by default. To create this fat jar, you need to use sbt-assembly.
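For example, building on the build.sbt from the question, the Spark artifacts can be marked provided (spark-submit supplies them at runtime) while scopt stays in the default compile scope so that it lands in the fat jar. A sketch, assuming sbt-assembly 0.13.x and its default keys; the merge rules are deliberately crude and may need adjusting for your dependencies:

name := "Movie Recommender System"

version := "1.0"

scalaVersion := "2.10.4"

// Spark is supplied by spark-submit at runtime, so keep it out of the fat jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.2.1" % "provided"

// scopt stays in compile scope, so sbt-assembly bundles it into the jar
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"

resolvers += Resolver.sonatypeRepo("public")

// resolve duplicate files that several dependencies ship (e.g. under META-INF)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}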
If, instead, you use Maven to package your Spark project, you need to add the maven-assembly-plugin, which helps bundle the dependencies:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <!-- this is used for inheritance merges -->
      <phase>package</phase>
      <!-- bind to the packaging phase -->
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
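After adding the plugin, mvn package produces an extra <artifactId>-<version>-jar-with-dependencies.jar in target/, and that is the jar to pass to spark-submit. As on the sbt side, the Spark dependencies can be given <scope>provided</scope> in their <dependency> entries so that only application libraries such as scopt are bundled.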