Overloaded method value createDirectStream error in Spark Streaming
When making the call:
val kafkaParams: Map[String, String] = ...
var topic: String = ...
val input2 = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topic.toSet)
I get the error:
overloaded method value createDirectStream with alternatives:
  (jssc: org.apache.spark.streaming.api.java.JavaStreamingContext, keyClass: Class[String], valueClass: Class[String], keyDecoderClass: Class[kafka.serializer.StringDecoder], valueDecoderClass: Class[kafka.serializer.StringDecoder], kafkaParams: java.util.Map[String,String], topics: java.util.Set[String])org.apache.spark.streaming.api.java.JavaPairInputDStream[String,String]
  (ssc: org.apache.spark.streaming.StreamingContext, kafkaParams: scala.collection.immutable.Map[String,String], topics: scala.collection.immutable.Set[String])(implicit evidence: scala.reflect.ClassTag[String], implicit evidence: scala.reflect.ClassTag[String], implicit evidence: scala.reflect.ClassTag[kafka.serializer.StringDecoder], implicit evidence: scala.reflect.ClassTag[kafka.serializer.StringDecoder])org.apache.spark.streaming.dstream.InputDStream[(String, String)]
cannot be applied to
  (org.apache.spark.streaming.StreamingContext, scala.collection.immutable.Map[String,String], scala.collection.immutable.Set[Char])
I get a similar error when calling the parameterized version of createStream as well.
Any idea what the problem is?
It's a very long message, but what it is saying is that topics needs to be a Set[String], not a Set[Char].
One way to fix this would be:
topic.map(_.toString).toSet
However, if you really only have one topic, just use Set(topic), because the operation above will split the string into a set of single-character strings.
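To see the difference between the three expressions, here is a minimal, Spark-free sketch; the topic name "my-topic" is just a placeholder:

```scala
object TopicSetDemo extends App {
  val topic: String = "my-topic" // hypothetical topic name

  // A String is a collection of Chars, so .toSet yields a Set[Char],
  // which does not match the Set[String] parameter of createDirectStream:
  val asChars: Set[Char] = topic.toSet

  // Mapping each Char to a String type-checks, but it splits the topic
  // name into single-character strings — almost never what you want:
  val split: Set[String] = topic.map(_.toString).toSet

  // For a single topic, wrap the whole string instead:
  val topics: Set[String] = Set(topic)

  println(asChars) // a set of individual characters
  println(split)   // a set of one-character strings
  println(topics)  // Set(my-topic)
}
```

Passing `topics` (the last form) to createDirectStream matches the expected scala.collection.immutable.Set[String] parameter and resolves the overload error.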