Getting java.lang.ClassNotFoundException while running Spark Application
I am new to Spark (Scala), and I am trying to run a Spark application via spark-submit. Unfortunately, I am getting a java.lang.ClassNotFoundException.
Here is my spark-submit command:
./spark-submit --class "spark.phoenix.a" --master local --deploy-mode client /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar
Here is the exception:
java.lang.ClassNotFoundException: spark.phoenix.a
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
And here is my sample code:
package spark.phoenix

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext

object a {
  def main(args: Array[String]) {
    import org.apache.phoenix.spark._
    import org.apache.spark.SparkContext

    val zkQuorum = "localhost:2181"
    val master = "local[*]"

    val sparkConf = new SparkConf()
    sparkConf.setAppName("phoenix-spark-save")
      .setMaster(s"${master}")

    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)

    // read from orders phoenix table
    val df = sqlContext.phoenixTableAsDataFrame("TABLE1",
      Seq.apply("ID", "COL1"),
      zkUrl = Some.apply(s"${zkQuorum}"))
  }
}
Please remove the quotation marks ("") from your command. Try something like this:
./spark-submit --class spark.phoenix.a --master local --deploy-mode client /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar
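If the quotes were not the problem, it is also worth verifying that spark.phoenix.a was actually packaged into the jar. A quick check (a sketch, assuming the JDK's jar tool is on your PATH, and using the jar path from the question):

jar tf /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar | grep spark/phoenix

If no spark/phoenix/a.class entry shows up, the class was never packaged, and the fix belongs in the build rather than in the command line.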
I think the problem is with your pom.xml file (if you are using Maven): you have not added the maven-shade-plugin.
Add the following to your pom.xml and try again:
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <id>eclipse-add-source</id>
          <goals>
            <goal>add-source</goal>
          </goals>
        </execution>
        <execution>
          <id>scala-compile-first</id>
          <phase>process-resources</phase>
          <goals>
            <goal>compile</goal>
          </goals>
        </execution>
        <execution>
          <id>scala-test-compile-first</id>
          <phase>process-test-resources</phase>
          <goals>
            <goal>testCompile</goal>
          </goals>
        </execution>
        <execution>
          <id>attach-scaladocs</id>
          <phase>verify</phase>
          <goals>
            <goal>doc-jar</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
        <recompileMode>incremental</recompileMode>
        <useZincServer>true</useZincServer>
        <args>
          <arg>-unchecked</arg>
          <arg>-deprecation</arg>
          <arg>-feature</arg>
        </args>
        <jvmArgs>
          <jvmArg>-Xms1024m</jvmArg>
          <jvmArg>-Xmx1024m</jvmArg>
          <jvmArg>-XX:ReservedCodeCacheSize=${CodeCacheSize}</jvmArg>
        </jvmArgs>
        <javacArgs>
          <javacArg>-source</javacArg>
          <javacArg>${java.version}</javacArg>
          <javacArg>-target</javacArg>
          <javacArg>${java.version}</javacArg>
          <javacArg>-Xlint:all,-serial,-path</javacArg>
        </javacArgs>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <artifactSet>
              <excludes>
                <exclude>org.xerial.snappy</exclude>
                <exclude>org.scala-lang.modules</exclude>
                <exclude>org.scala-lang</exclude>
              </excludes>
            </artifactSet>
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
            <relocations>
              <relocation>
                <pattern>com.google.common</pattern>
                <shadedPattern>shaded.com.google.common</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
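Once the shade plugin is in place, a typical workflow would be to rebuild and resubmit the shaded jar. A sketch, assuming a standard Maven layout (by default the shade plugin replaces the main artifact in target/, and the jar name here matches the one from the question):

mvn clean package
./spark-submit --class spark.phoenix.a --master local --deploy-mode client target/phoenix-0.0.1-SNAPSHOT.jar

Note that the maven-shade-plugin declaration above does not pin a <version>; depending on your Maven setup you may want to add one explicitly.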