Scala generic encoder for Spark case class
How can I get this method to compile? Strangely, Spark's implicits are already imported.
def loadDsFromHive[T <: Product](tableName: String, spark: SparkSession): Dataset[T] = {
import spark.implicits._
spark.sql(s"SELECT * FROM $tableName").as[T]
}
This is the error:
Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
[error] spark.sql(s"SELECT * FROM $tableName").as[T]
According to the source of org.apache.spark.sql.SQLImplicits, your type needs the TypeTag type class in scope for the implicit Encoder to exist:
import scala.reflect.runtime.universe.TypeTag
def loadDsFromHive[T <: Product: TypeTag](tableName: String, spark: SparkSession): Dataset[T] = ...
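A minimal sketch of the corrected method in context, assuming a hypothetical Person case class and a Hive table named "people" (both illustrative, not from the original question):

```scala
import org.apache.spark.sql.{Dataset, SparkSession}
import scala.reflect.runtime.universe.TypeTag

// Hypothetical case class standing in for the Hive table's schema.
case class Person(name: String, age: Int)

object HiveLoader {
  // The TypeTag context bound lets spark.implicits._ derive an
  // Encoder[T] for the Product type, which .as[T] requires.
  def loadDsFromHive[T <: Product: TypeTag](tableName: String, spark: SparkSession): Dataset[T] = {
    import spark.implicits._
    spark.sql(s"SELECT * FROM $tableName").as[T]
  }
}

// Usage (assumes a Hive-enabled session and a table whose columns
// match the case class fields):
// val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
// val people: Dataset[Person] = HiveLoader.loadDsFromHive[Person]("people", spark)
```

Without the context bound, T is erased at the call site, so the compiler cannot summon the implicit Encoder[T] that spark.implicits._ would otherwise derive for case classes; the TypeTag carries the runtime type information needed for that derivation.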