Spark streaming StreamingContext.start() - Error starting receiver 0
I have a project using Spark Streaming that I run with 'spark-submit', and I keep hitting this error:
15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:52)
at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun.apply(ReceiverTracker.scala:264)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun.apply(ReceiverTracker.scala:257)
at org.apache.spark.SparkContext$$anonfun$runJob.apply(SparkContext.scala:1121)
at org.apache.spark.SparkContext$$anonfun$runJob.apply(SparkContext.scala:1121)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
at org.apache.spark.scheduler.Task.run(Task.scala:54)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Here is the code the error comes from; everything runs fine up to ssc.start():
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val Array(zkQuorum, group, topics, numThreads) = args
val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(2)) // 2-second batch interval
ssc.checkpoint("checkpoint")
// ... (stream setup elided) ...
ssc.start()
ssc.awaitTermination()
I have run the SparkPi example with 'spark-submit' and it worked fine, so I can't figure out what is causing the problem in my application. Any help would be appreciated.
From the documentation of java.lang.AbstractMethodError:
Normally, this error is caught by the compiler; this error can only
occur at run time if the definition of some class has incompatibly
changed since the currently executing method was last compiled.
This means there is a version incompatibility between your compile-time and runtime dependencies: the Spark version your application was compiled against does not match the Spark version that spark-submit runs it on. Align those versions (and rebuild the jar) to resolve the error.
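As an illustration, here is a minimal build.sbt sketch for this kind of setup, assuming an sbt build and the spark-streaming-kafka connector; the version numbers below are placeholders and must be replaced with the actual Spark version of your cluster (check with spark-submit --version):

// build.sbt -- minimal sketch; versions are illustrative placeholders
name := "Jumbly_StreamingConsumer"

scalaVersion := "2.10.4" // Scala version used by Spark 1.x builds

libraryDependencies ++= Seq(
  // "provided": the cluster supplies these jars at runtime, so the
  // versions you compile against must match what spark-submit runs on
  "org.apache.spark" %% "spark-core"            % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.1.0" % "provided",
  // the Kafka connector is bundled into your application jar, but its
  // version must still match the spark-streaming version above
  "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
)

When these versions drift apart, a trait such as org.apache.spark.Logging can change incompatibly between compile time and run time, and calls into it then fail with java.lang.AbstractMethodError, exactly as in the stack trace above. After aligning the versions, rebuild the assembly jar and resubmit with spark-submit.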