Please have a look at the following code snippet:
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds
abstract class MQTTDStream[T <: Any](ssc: StreamingContext) extends DStream[T](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
I get the following error:
type mismatch; found : Int(1) required: T
I've declared T to extend Any, so why is the compiler complaining? Int is a subtype of Any, isn't it?
Thanks a lot!
Update (2.9.16):
I changed the class to extend DStream[Int], but I still get the same error:
abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
Edit (2.9.16):
Thanks to Alexey, this is the working solution:
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds
abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  // compute now returns Option[RDD[Int]], matching DStream[Int]
  override def compute(validTime: Time): Option[RDD[Int]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1))
  override def dependencies = Nil
  override def slideDuration = Seconds(1) // just an example
}
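For completeness, here is a minimal usage sketch (the subclass name and the local setup are my own, not part of the original post). Since compute, dependencies, and slideDuration are all overridden, a concrete subclass needs no further members:

// Hypothetical usage, assuming the working MQTTDStream above is in scope
class SimpleMQTTDStream(ssc: StreamingContext) extends MQTTDStream[Int](ssc)

val ssc = new StreamingContext("local[2]", "MQTTDStreamDemo", Seconds(1))
val stream = new SimpleMQTTDStream(ssc)
stream.print() // emits 1, 2, 3 once per batch after the context starts
ssc.start()
ssc.awaitTermination() // blocks; shown only to make the sketch complete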
The caller gets to pick T, not you. So your class definition must work for every T that satisfies the type bounds; and since every type is a subtype of Any, the bound T <: Any rules nothing out. That is, if someone creates e.g. a MQTTDStream[String], then its compute method has to return an Option[RDD[String]]. But it doesn't: it returns a Some[RDD[Int]].
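To connect this back to the original goal: if you really want a generic MQTTDStream[T], the values of T have to come from the caller, because only the caller knows what T is. Below is a sketch of that variant (the class name and the elems constructor parameter are my own, not from the answer); note that both the DStream constructor and parallelize require a ClassTag[T], supplied here via a context bound:

import scala.reflect.ClassTag
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds

// Generic variant: the caller supplies the elements, so compute can
// return an RDD[T] for whatever T the caller picked.
class GenericMQTTDStream[T: ClassTag](ssc: StreamingContext, elems: Seq[T])
    extends DStream[T](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(elems, 1))
  override def dependencies = Nil
  override def slideDuration = Seconds(1)
}

Now new GenericMQTTDStream(ssc, Seq("a", "b", "c")) is a DStream[String] and new GenericMQTTDStream(ssc, Seq(1, 2, 3)) is a DStream[Int], with no type mismatch in either case.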