org/apache/spark/Logging issue on HBase-Spark
I want to process data in Spark and then insert it into HBase. I am using the
HBase-Spark (Apache HBase) library (https://mvnrepository.com/artifact/org.apache.hbase/hbase-spark/2.0.0-alpha4).
I get the following exception:
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.hadoop.hbase.spark.JavaHBaseContext.<init>(JavaHBaseContext.scala:46)
at job.sqoop_process.HostLookupGenerator.insert(HostLookupGenerator.java:44)
at job.sqoop_process.SparkSqoopJob.process(SparkSqoopJob.java:17)
at job.spark.SparkExecutor$Executor.execute(SparkExecutor.java:75)
at job.spark.SparkExecutor.main(SparkExecutor.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon.run(ApplicationMaster.scala:721)
The problem is that hbase-spark 2.0.0-alpha4 uses Spark's Logging class, which is no longer available in Spark 2.3.1 (it was made internal to Spark in the 2.x line).
Can anyone help me resolve this?
Note: I am using Spark 2.3.1 and HBase 1.2.6.1.
I was on Spark 2.2.x and ran into the same issue. I forget exactly which of the following dependencies I was missing, but check which one is absent from your configuration and try adding it. It should work. I believe it was spark-streaming:
<dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
    <version>1.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>${hbase-spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-mapreduce</artifactId>
    <version>${hbase-spark.version}</version>
</dependency>
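To see which class is actually missing at runtime, one quick diagnostic is to probe the classpath for the class named in the error, using the same classpath as your Spark driver. This is a sketch; `ClasspathCheck` is an illustrative name, not part of any of the libraries above:

```java
// Probe the classpath for a class by name without initializing it.
// Run with the same classpath as the Spark driver to confirm whether
// org/apache/spark/Logging is visible there.
public class ClasspathCheck {
    public static boolean isOnClasspath(String className) {
        try {
            // initialize = false: only locate the class, do not run static initializers
            Class.forName(className, false, ClasspathCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = args.length > 0 ? args[0] : "org.apache.spark.Logging";
        System.out.println(cls + (isOnClasspath(cls) ? " is on the classpath" : " is MISSING"));
    }
}
```

On a classpath containing only Spark 2.x jars, the Logging probe reports MISSING, which matches the `NoClassDefFoundError` above.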
As of today, I was able to resolve this using the version available at https://mvnrepository.com/artifact/org.apache.hbase.connectors.spark/hbase-spark/1.0.0:
<!-- https://mvnrepository.com/artifact/org.apache.hbase.connectors.spark/hbase-spark -->
<dependency>
    <groupId>org.apache.hbase.connectors.spark</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>1.0.0</version>
</dependency>
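If you would rather not bundle the connector into your application jar, one option (assuming the cluster can reach Maven Central) is to let spark-submit resolve it via `--packages`. The main class below is taken from the stack trace; the jar name is a placeholder:

```shell
spark-submit \
  --class job.spark.SparkExecutor \
  --packages org.apache.hbase.connectors.spark:hbase-spark:1.0.0 \
  your-app.jar
```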