
spark implicit encoder not found in scope

I have already outlined the problem in spark custom kryo encoder not providing schema for UDF, but have now created a minimal example: https://gist.github.com/geoHeil/dc9cfb8eca5c06fca01fc9fc03431b2f

class SomeOtherClass(foo: Int)
case class FooWithSomeOtherClass(a: Int, b: String, bar: SomeOtherClass)
case class FooWithoutOtherClass(a: Int, b: String, bar: Int)
case class Foo(a: Int)
implicit val someOtherClassEncoder: Encoder[SomeOtherClass] = Encoders.kryo[SomeOtherClass]
val df2 = Seq(FooWithSomeOtherClass(1, "one", new SomeOtherClass(4))).toDS
val df3 = Seq(FooWithoutOtherClass(1, "one", 1), FooWithoutOtherClass(2, "two", 2)).toDS
val df4 = df3.map(d => FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar)))

Here, even the createDataSet statement fails with:
java.lang.UnsupportedOperationException: No Encoder found for SomeOtherClass
- field (class: "SomeOtherClass", name: "bar")
- root class: "FooWithSomeOtherClass"

Why is the encoder not in scope, or at least not in the right scope?

Also, trying to specify an explicit encoder like:

df3.map(d => {FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar))}, (Int, String, Encoders.kryo[SomeOtherClass]))

does not work.

This happens because you should use Kryo encoders throughout the whole serialization stack, meaning your top-level object should have a Kryo encoder. The following runs successfully on a local Spark shell (the change you are interested in is on the first line):

  implicit val topLevelObjectEncoder: Encoder[FooWithSomeOtherClass] = Encoders.kryo[FooWithSomeOtherClass]

  val df1 = Seq(Foo(1), Foo(2)).toDF

  val df2 = Seq(FooWithSomeOtherClass(1, "one", new SomeOtherClass(4))).toDS

  val df3 = Seq(FooWithoutOtherClass(1, "one", 1), FooWithoutOtherClass(2, "two", 2)).toDS
  df3.printSchema
  df3.show

  val df4 = df3.map(d => FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar)))
  df4.printSchema
  df4.show
  df4.collect
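Note that with a top-level Kryo encoder the Dataset loses its columnar schema: the whole object is serialized into a single binary column. A minimal sketch of an alternative, under the assumption that SomeOtherClass may be redefined: making it a case class lets Spark derive a product encoder for the whole structure, so no Kryo encoder is needed and the nested schema stays visible.

```scala
// Sketch, assuming SomeOtherClass can be changed to a case class.
// Case classes participate in Spark's derived product encoders via
// spark.implicits._, so the nested fields remain queryable columns.
import spark.implicits._

case class SomeOtherClass(foo: Int)
case class FooWithSomeOtherClass(a: Int, b: String, bar: SomeOtherClass)

val ds = Seq(FooWithSomeOtherClass(1, "one", SomeOtherClass(4))).toDS
ds.printSchema
// bar is now a nested struct containing an Int field foo,
// instead of the single binary column produced by Encoders.kryo
```

This only applies if you control the definition of SomeOtherClass; if it is a third-party class, the top-level Kryo encoder above remains the way to go.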