Spark Streaming Kafka consumer doesn't like DStream

I'm using the Spark shell (Scala 2.10 with org.apache.spark:spark-streaming-kafka-0-10_2.10:2.0.1) to test a Spark/Kafka consumer:

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.dstream.DStream

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "mykafka01.example.com:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "mykafka",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val topics = Array("mytopic")

def createKafkaStream(ssc: StreamingContext, topics: Array[String], kafkaParams: Map[String,Object]) : DStream[(String, String)] = {
    KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
}

def messageConsumer(): StreamingContext = {
    val ssc = new StreamingContext(SparkContext.getOrCreate(), Seconds(10))

    createKafkaStream(ssc, topics, kafkaParams).foreachRDD(rdd => {
        rdd.collect().foreach { msg =>
            try {
                println("Received message: " + msg._2)
            } catch {
                case e @ (_: Exception | _: Error | _: Throwable) => {
                    println("Exception: " + e.getMessage)
                    e.printStackTrace()
                }
            }
        }
    })

    ssc
}

val ssc = StreamingContext.getActiveOrCreate(messageConsumer)
ssc.start()
ssc.awaitTermination()

When I run this, I get the following error:

<console>:60: error: type mismatch;
 found   : org.apache.spark.streaming.dstream.InputDStream[org.apache.kafka.clients.consumer.ConsumerRecord[String,String]]
 required: org.apache.spark.streaming.dstream.DStream[(String, String)]
                  KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
                                                               ^

I've checked the Scala/API docs over and over, and this code looks like it should execute correctly. Any idea where I'm going wrong?

Subscribe's topics parameter is an Array[String], and you are passing in a single String per def createKafkaStream(ssc: StreamingContext, topics: String, .... Changing the parameter type to Array[String] (and invoking the function accordingly) will fix the problem. Note that the compiler message also flags the declared return type: createDirectStream[String, String] yields an InputDStream[ConsumerRecord[String, String]], not a DStream[(String, String)], so the signature's return type needs to change to match as well.
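
A minimal sketch of the corrected helper, assuming the same ssc, topics, and kafkaParams defined above. The return type is declared as DStream[ConsumerRecord[String, String]] to match what createDirectStream actually produces, and each record's payload is then read with value() instead of the tuple accessor _2:

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// topics is an Array[String], which is what Subscribe expects;
// the return type mirrors what createDirectStream actually returns.
def createKafkaStream(ssc: StreamingContext,
                      topics: Array[String],
                      kafkaParams: Map[String, Object]): DStream[ConsumerRecord[String, String]] = {
    KafkaUtils.createDirectStream[String, String](
        ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
}

// Usage: each element is now a ConsumerRecord, so use value() rather than _2.
createKafkaStream(ssc, topics, kafkaParams).foreachRDD { rdd =>
    rdd.collect().foreach(record => println("Received message: " + record.value()))
}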