Need some help on setting up Spark for Cassandra in Java

Setting up Spark to access Cassandra from Java throws a NoClassDefFoundError:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at Client.main(Client.java:22)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 13 more
Added two jar files: spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar and spark-core_2.10-0.9.0-incubating.jar. spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar is built against Scala 2.10.

Typing scala -version at the command prompt shows Scala code runner version 2.11.6. Accessing Spark from spark-shell works without problems, and even accessing a Cassandra column family from spark-shell works fine.
import java.util.*;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.PairFunction;

import com.datastax.spark.connector.*;
import com.datastax.spark.connector.cql.*;
import com.datastax.spark.*;

//import scala.Tuple2;

public class Client {
    public static void main(String[] a) {
        // NB: setMaster expects a Spark master URL such as
        // "spark://192.168.1.15:7077" or "local[*]", not a bare IP address.
        SparkConf conf = new SparkConf()
                .setAppName("MTMPNLTesting")
                .setMaster("192.168.1.15");
    }
}
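To confirm which Scala runtime the application itself sees (the scala command on the PATH can differ from the jars on the application classpath), a small probe like the one below can be compiled and run against the same classpath as Client. This is a sketch, assuming scala-library is the jar in question; scala.util.Properties ships inside scala-library, so a NoClassDefFoundError here would mean no Scala runtime jar is on the classpath at all:

import scala.util.Properties;

public class ScalaVersionCheck {
    public static void main(String[] args) {
        // versionNumberString() reports the version of the scala-library
        // jar actually visible to this JVM, e.g. "2.10.4" or "2.11.6".
        System.out.println("scala-library version: " + Properties.versionNumberString());
    }
}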
What could be the cause of this error?
Also try including the Scala jar in your classpath. If you are not using Maven, download the jar and include it in your project build properties.
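For reference, a minimal Maven dependency block that keeps everything on one Scala binary version might look like the sketch below. The versions are assumptions inferred from the jars named in the question (a _2.10 build of the connector, so a 2.10 Scala runtime and a _2.10 build of Spark core); substitute whatever release combination you actually use:

<!-- Sketch only: all three artifacts must share the same Scala binary
     version (_2.10 here); the version numbers are illustrative. -->
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.4.0-M1</version>
  </dependency>
</dependencies>

Without Maven the same rule applies by hand: the scala-library jar you place on the classpath must match the _2.10 suffix of the connector and Spark jars.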