How do I return concrete type when overriding a generically typed method in Scala?
Please have a look at the following code snippet:
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds
abstract class MQTTDStream[T <: Any](ssc: StreamingContext) extends DStream(ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
I get the following error:
type mismatch; found : Int(1) required: T
I've declared T to extend Any, so why does the compiler complain? Int is a subtype of Any, isn't it?
Thanks a lot!
Update 2.9.16:
Changed it to extend from DStream[Int], but I still get the same error:
abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
Edit 2.9.16:
Thanks to Alexey, here is the working solution:
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds
abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[Int]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1))
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
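Side note: in this working version the element type is fixed to Int, so the type parameter T is no longer used. A minimal sketch of the same class without it, assuming nothing else in the real code depends on T:

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}
import org.apache.spark.streaming.dstream.DStream

// Same stream as above, with the now-unused type parameter dropped:
// the element type is fixed to Int, so callers have nothing left to choose.
abstract class MQTTDStream(ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[Int]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1))
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}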
The caller gets to pick T, not you. Therefore your class definition has to work for all T (satisfying the type bounds, but every T is a subtype of Any anyway).
That is, if someone creates e.g. an MQTTDStream[String], then its compute method has to return an Option[RDD[String]]. But it doesn't: it returns Some[RDD[Int]].
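If the element type really should stay generic, then the values produced in compute must themselves be of type T rather than hard-coded Ints. A minimal sketch of one way to do that, using a hypothetical valuesFor method (a name chosen for this sketch, not part of Spark's API) that lets the subclass supply the batch elements:

import scala.reflect.ClassTag

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}
import org.apache.spark.streaming.dstream.DStream

// The ClassTag context bound is needed by both DStream[T] and parallelize.
abstract class GenericMQTTDStream[T: ClassTag](ssc: StreamingContext)
    extends DStream[T](ssc) {

  // Whoever fixes T also has to say what the elements of type T are.
  protected def valuesFor(validTime: Time): Seq[T]

  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(valuesFor(validTime), 1))

  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}

// A caller that picks T = String then also provides String elements:
class StringMQTTDStream(ssc: StreamingContext)
    extends GenericMQTTDStream[String](ssc) {
  override protected def valuesFor(validTime: Time): Seq[String] = Seq("a", "b", "c")
}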