Mahout 0.11.1 Spark-Shell NoClassDefFoundError
I am trying to get the Mahout Spark-Shell up and running on a Cloudera QuickStart VM with:
Mahout: Version 0.11.1
Spark: Version 1.5.0-cdh5.5.1
Java: 1.7.0_67
My .bashrc is set up as:
export MAHOUT_HOME=/home/cloudera/Desktop/Mahout_0_11_1
export MAHOUT_LOCAL=true
export SPARK_HOME=/usr/lib/spark
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
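After saving those settings, reloading the file and echoing the variables is a quick sanity check that the shell launching Mahout actually sees them (the expected values are just the ones from above):
source ~/.bashrc
echo "$MAHOUT_HOME"   # should print /home/cloudera/Desktop/Mahout_0_11_1
echo "$SPARK_HOME"    # should print /usr/lib/spark
echo "$JAVA_HOME"     # should print /usr/java/jdk1.7.0_67-cloudera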
When I run the Mahout Spark-Shell, I get the following error message:
java.lang.NoClassDefFoundError: com/sun/jersey/spi/container/servlet/ServletContainer
at org.apache.spark.status.api.v1.ApiRootResource$.getServletHandler(ApiRootResource.scala:187)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:466)
at org.apache.mahout.sparkbindings.package$.mahoutSparkContext(package.scala:91)
at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:89)
...
Followed by:
Mahout distributed context is available as "implicit val sdc".
java.lang.NullPointerException
at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1033)
at $iwC$$iwC.<init>(<console>:11)
at $iwC.<init>(<console>:19)
In spark-env.sh, add
export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
and make sure jersey-servlet-1.9.jar is on the classpath.
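On a CDH QuickStart VM the hadoop binary usually lives under /usr/lib/hadoop, so the line might read as follows (the exact path is an assumption; substitute the one from your own install):
export SPARK_DIST_CLASSPATH=$(/usr/lib/hadoop/bin/hadoop classpath)   # path is a guess for CDH; verify locally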
Check all of the *-env.sh scripts, set the environment variables as explicitly as possible in each one, and then check the logs for errors.
cd /
find . -name jersey-servlet-1.9.jar
and make sure that the path where the file is found is on your classpath.
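One way to wire the result of that find into the classpath is to capture the jar's location in a variable and append it (a sketch; it assumes the find turns up at least one copy of the jar):
JERSEY_JAR=$(find / -name jersey-servlet-1.9.jar 2>/dev/null | head -n 1)
echo "$JERSEY_JAR"    # verify something was actually found before using it
export SPARK_DIST_CLASSPATH="$SPARK_DIST_CLASSPATH:$JERSEY_JAR"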
Edit: add jersey-server-1.9.jar to the $MAHOUT_HOME/lib/ directory.
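For example (a sketch; the find locates the jar wherever your distribution put it, and note this is jersey-server, not jersey-servlet):
SERVER_JAR=$(find / -name jersey-server-1.9.jar 2>/dev/null | head -n 1)
cp "$SERVER_JAR" "$MAHOUT_HOME/lib/"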