Return type overloading when implementing a Spark DStream
I'm currently trying to implement my own Apache Spark V2.0 DStream:
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
class MQTTDStream[T](ssc: StreamingContext) extends DStream(ssc) {
override def compute(validTime: Time): RDD[Int] = { ssc.sparkContext.parallelize(Array(1, 2, 3), 1) }
}
The compiler in my Eclipse environment accepts this, but when I paste the code into a Jupyter notebook in IBM DSExperience I get the following error:
Name: Compile Error
Message: :21: error: overriding method compute in class DStream of type (validTime: org.apache.spark.streaming.Time)Option[org.apache.spark.rdd.RDD[Nothing]];
 method compute has incompatible type
        override def compute(validTime: Time): RDD[Int] = { ssc.sparkContext.parallelize(Array(1, 2, 3), 1) }
                     ^
:20: error: class MQTTDStream needs to be abstract, since: it has 2 unimplemented members.
/** As seen from class MQTTDStream, the missing signatures are as follows.
 *  For convenience, these are usable as stub implementations.
 */
  def dependencies: List[org.apache.spark.streaming.dstream.DStream[_]] = ???
  def slideDuration: org.apache.spark.streaming.Duration = ???
        class MQTTDStream[T](ssc: StreamingContext) extends DStream(ssc) {
        ^
StackTrace:
Edit: 31.8.16
Now I've made a bit of progress:
abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream(ssc) {
override def compute(validTime: Time): Option[RDD[T]] =
Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1))
override def dependencies = Nil
override def slideDuration = Seconds(1) // just an example
}
which gives me:
type mismatch; found : Int(1) required: T
1. You're missing the type parameter to DStream (that's where the Nothing in the error message comes from);
2. compute should return an Option[RDD[Something]], not just an RDD[Something];
3. You also need to define dependencies and slideDuration.
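The first point can be seen in isolation with a small Spark-free sketch (FakeDStream is a hypothetical stand-in, not the real Spark API): when a subclass calls the superclass constructor without a type argument, Scala infers T = Nothing, which is exactly the Nothing in the notebook error.

```scala
// Hypothetical stand-in mimicking DStream's shape (NOT the real Spark API)
abstract class FakeDStream[T](ssc: String) {
  def compute(validTime: Long): Option[List[T]]
}

// Writing `extends FakeDStream(ssc)` with no type argument would make
// Scala infer T = Nothing, so an override returning Option[List[Int]]
// is rejected -- the same error shape as in the notebook. Passing the
// type argument explicitly fixes it:
class IntStream(ssc: String) extends FakeDStream[Int](ssc) {
  override def compute(validTime: Long): Option[List[Int]] =
    Some(List(1, 2, 3))
}
```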
So the minimal change is:
class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
override def compute(validTime: Time): Option[RDD[Int]] =
Some(ssc.sparkContext.parallelize(Array(1, 2, 3), 1))
override def dependencies = Nil
override def slideDuration = Seconds(1) // just an example
}
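If the goal is a stream that is genuinely generic in T (rather than fixing the element type to Int), the values have to come from the caller, and a ClassTag has to be carried along, since Spark's parallelize requires one. A Spark-free sketch of that pattern, using hypothetical Source/ConstantStream names in place of the real DStream machinery:

```scala
import scala.reflect.ClassTag

// Hypothetical generic source: the element type is chosen by the caller,
// so there is no "found: Int, required: T" mismatch inside compute.
abstract class Source[T: ClassTag] {
  def compute(): Option[Array[T]]
}

class ConstantStream[T: ClassTag](values: Seq[T]) extends Source[T] {
  // toArray needs the ClassTag here, just as RDD creation does in Spark
  override def compute(): Option[Array[T]] = Some(values.toArray)
}
```

Applied to the real class, the same shape would look like `class MQTTDStream[T: ClassTag](ssc: StreamingContext, values: Seq[T]) extends DStream[T](ssc)`, with compute building its RDD from values instead of hard-coded Ints.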