Tags: scala, apache-spark, apache-spark-sql, implicit-conversion, apache-spark-encoders

spark implicit encoder not found in scope


I have a problem with Spark that I already outlined in "spark custom kryo encoder not providing schema for UDF", but I have now created a minimal example: https://gist.github.com/geoHeil/dc9cfb8eca5c06fca01fc9fc03431b2f

import org.apache.spark.sql.{Encoder, Encoders}
import spark.implicits._ // spark is the SparkSession (predefined in spark-shell)

class SomeOtherClass(foo: Int)
case class FooWithSomeOtherClass(a: Int, b: String, bar: SomeOtherClass)
case class FooWithoutOtherClass(a: Int, b: String, bar: Int)
case class Foo(a: Int)
implicit val someOtherClassEncoder: Encoder[SomeOtherClass] = Encoders.kryo[SomeOtherClass]
val df2 = Seq(FooWithSomeOtherClass(1, "one", new SomeOtherClass(4))).toDS
val df3 = Seq(FooWithoutOtherClass(1, "one", 1), FooWithoutOtherClass(2, "two", 2)).toDS
val df4 = df3.map(d => FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar)))

Here, even the Dataset creation (the `toDS` call) fails with:

java.lang.UnsupportedOperationException: No Encoder found for SomeOtherClass
- field (class: "SomeOtherClass", name: "bar")
- root class: "FooWithSomeOtherClass"

Why is the encoder not in scope or at least not in the right scope?

Also, trying to specify an explicit encoder like:

df3.map(d => {FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar))}, (Int, String, Encoders.kryo[SomeOtherClass]))

does not work.
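For reference, `Dataset.map` takes its `Encoder` in a second, implicit parameter list, not as a tuple of per-field encoders, so the closest valid form of the attempt above would pass a single encoder covering the whole result type. A sketch, reusing the classes defined above:

```scala
import org.apache.spark.sql.Encoders

// map's encoder goes in its second (implicit) parameter list and must
// cover the entire result type, not individual fields:
val df4 = df3.map(d => FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar)))(
  Encoders.kryo[FooWithSomeOtherClass])
```

This compiles because `Encoders.kryo[FooWithSomeOtherClass]` is one `Encoder` for the full result type; there is no way to combine `Int`, `String`, and a Kryo encoder into a case-class encoder this way.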


Solution

  • This happens because the Kryo encoder must be used through the whole serialization stack: your top-level object itself needs a Kryo encoder. The following runs successfully in a local Spark shell (the change you are interested in is on the first line):

      implicit val topLevelObjectEncoder: Encoder[FooWithSomeOtherClass] = Encoders.kryo[FooWithSomeOtherClass]
    
      val df1 = Seq(Foo(1), Foo(2)).toDF
    
      val df2 = Seq(FooWithSomeOtherClass(1, "one", new SomeOtherClass(4))).toDS
    
      val df3 = Seq(FooWithoutOtherClass(1, "one", 1), FooWithoutOtherClass(2, "two", 2)).toDS
      df3.printSchema
      df3.show
    
      val df4 = df3.map(d => FooWithSomeOtherClass(d.a, d.b, new SomeOtherClass(d.bar)))
      df4.printSchema
      df4.show
      df4.collect
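    One caveat worth noting: `Encoders.kryo` serializes the whole object into a single binary `value` column, so `a` and `b` are no longer individually queryable. If you need them as real columns, a possible workaround (a sketch, untested against your exact setup) is to map to a tuple and build the encoder with `Encoders.tuple`, applying Kryo only to the field Spark cannot encode:

```scala
import org.apache.spark.sql.{Encoder, Encoders}

// Sketch: keep a and b as regular columns; use Kryo only for SomeOtherClass.
implicit val tupleEncoder: Encoder[(Int, String, SomeOtherClass)] =
  Encoders.tuple(Encoders.scalaInt, Encoders.STRING, Encoders.kryo[SomeOtherClass])

val df5 = df3.map(d => (d.a, d.b, new SomeOtherClass(d.bar)))
df5.printSchema // _1 and _2 remain int/string columns; _3 is binary
```

    With this layout you can still filter and group on `_1` and `_2` while the opaque `SomeOtherClass` rides along as Kryo-serialized bytes.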