Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat
I am launching a Spark application using SparkLauncher. Inside the Spark application I insert data into a Hive table and use some HBase-backed Hive tables in a join query. I have added hive-hbase-handler-1.1.0-cdh5.13.0.jar to the Spark launcher, but I still get `Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat` even though the jar is added. The launcher code that produces the error:
def launch(hdfsFilePath: String): Unit = {
  println("Inside ApplicationLauncher")
  val command = new SparkLauncher()
    .setAppResource("/home/cloudera/Desktop/Avi/LiveProjects/MusicDataAnalysis/target/scala-2.11/musicdataanalysis_2.11-0.1.jar")
    .setMainClass("ParseInputFile")
    .setVerbose(false)
    .addAppArgs(hdfsFilePath)
    .setMaster("local")
    .addJar("/home/cloudera/Desktop/Avi/jars/hive-hbase-handler-1.1.0-cdh5.13.0.jar")
    .addJar("/home/cloudera/Desktop/Avi/jars/spark-xml_2.11-0.5.0.jar")
  println("Done with Spark Launcher")
  val appHandle = command.startApplication()
  appHandle.addListener(new SparkAppHandle.Listener {
    def infoChanged(sparkAppHandle: SparkAppHandle): Unit = {
      // println(sparkAppHandle.getState + " Custom Print")
    }
    def stateChanged(sparkAppHandle: SparkAppHandle): Unit = {
      println(sparkAppHandle.getState)
      if ("FINISHED".equals(sparkAppHandle.getState.toString)) {
        sparkAppHandle.stop
      }
    }
  })
}
My problem was solved once I added hbase-0.92.1.jar alongside the hive-hbase-handler jar, referencing both with `file:///` URIs. Please find the working code below:
def launch(hdfsFilePath: String): Unit = {
  println("Inside ApplicationLauncher")
  val command = new SparkLauncher()
    .setAppResource("/home/cloudera/Desktop/Avi/LiveProjects/MusicDataAnalysis/target/scala-2.11/musicdataanalysis_2.11-0.1.jar")
    .setMainClass("ParseInputFile")
    .setVerbose(false)
    .addAppArgs(hdfsFilePath)
    .setMaster("local")
    .addJar("file:///home/cloudera/Desktop/Avi/jars/hbase-0.92.1.jar")
    .addJar("file:///home/cloudera/Desktop/Avi/jars/hive-hbase-handler-3.1.1.jar")
    .addJar("file:///home/cloudera/Desktop/Avi/jars/spark-xml_2.11-0.5.0.jar")
  println("Done with Spark Launcher")
  val appHandle = command.startApplication()
  appHandle.addListener(new SparkAppHandle.Listener {
    def infoChanged(sparkAppHandle: SparkAppHandle): Unit = {
      // println(sparkAppHandle.getState + " Custom Print")
    }
    def stateChanged(sparkAppHandle: SparkAppHandle): Unit = {
      println(sparkAppHandle.getState)
      if ("FINISHED".equals(sparkAppHandle.getState.toString)) {
        sparkAppHandle.stop
      }
    }
  })
}
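When debugging this kind of ClassNotFoundException, it can help to confirm at runtime whether the missing class is actually visible on the application's classpath, rather than guessing from the jar list. A minimal sketch (the class name is taken from the stack trace above; `ClassProbe` is a hypothetical helper, run inside the launched application rather than the launcher):

```scala
// Probe whether a class can be loaded by the current classloader.
object ClassProbe {
  // Returns true if `className` resolves, false if it is missing from the classpath.
  def isLoadable(className: String): Boolean =
    try { Class.forName(className); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // The class named in the stack trace above.
    val handler = "org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat"
    println(s"$handler loadable: ${isLoadable(handler)}")
  }
}
```

If this prints `false` after the launcher has started the application, the jar passed to `addJar` never made it onto the classpath (a wrong path or a missing `file://` scheme are common causes).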