How to create a Scala case class with struct types?
My DataFrame schema looks like this; it was created by defining a case class:
|-- _id: struct (nullable = true)
| |-- oid: string (nullable = true)
|-- message: string (nullable = true)
|-- powerData: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- current: array (nullable = true)
| | | |-- element: double (containsNull = true)
| | |-- delayStartTime: double (nullable = true)
| | |-- idSub1: string (nullable = true)
| | |-- motorNumber: integer (nullable = true)
| | |-- power: array (nullable = true)
| | | |-- element: double (containsNull = true)
I created the case class like this, but I'm not sure how to declare the struct fields in this case:
case class CurrentSchema(_id: StructType, message: String, powerData: Array[StructType])
I get this error when applying the schema to my DataFrame:
val dfRef = MongoSpark.load[CurrentSchema](sparkSessionRef)
Exception in thread "main" scala.MatchError: org.apache.spark.sql.types.StructType (of class scala.reflect.internal.Types$ClassNoArgsTypeRef)
Has anyone done this before? Any help is appreciated.
Thanks in advance.
You will have to create a separate case class for each struct:
case class IdStruct(oid: String)
case class PdStruct(current: Array[Double], delayStartTime: Double, idSub1: String, motorNumber: Int, power: Array[Double])
case class CurrentSchema(_id: IdStruct, message: String, powerData: Array[PdStruct])
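As a quick sanity check, the nested case classes can be instantiated directly in plain Scala to confirm the shapes line up with the schema (a minimal sketch; the sample values are made up):

```scala
// Nested case classes mirroring the DataFrame schema above.
case class IdStruct(oid: String)
case class PdStruct(
  current: Array[Double],
  delayStartTime: Double,
  idSub1: String,
  motorNumber: Int,
  power: Array[Double]
)
case class CurrentSchema(_id: IdStruct, message: String, powerData: Array[PdStruct])

// Construct a sample record; the field values here are hypothetical.
val record = CurrentSchema(
  _id = IdStruct("507f1f77bcf86cd799439011"),
  message = "ok",
  powerData = Array(
    PdStruct(Array(1.2, 3.4), 0.5, "sub-1", 2, Array(10.0, 20.0))
  )
)

println(record.powerData(0).motorNumber)
```

With these definitions in scope, `MongoSpark.load[CurrentSchema](sparkSessionRef)` should no longer hit the `MatchError`, because every struct in the schema now maps to a concrete case class rather than the runtime `StructType` type.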