scala, apache-spark, amazon-s3, kryo

Spark: save and load machine learning model on s3


I would like to save and load a machine learning model on S3.

This is what I did:

import com.amazonaws.auth.profile.ProfileCredentialsProvider
import org.apache.spark.ml.tuning.TrainValidationSplitModel

// Read AWS credentials from the local credentials profile and hand them
// to the Hadoop S3 filesystem that Spark uses.
val credentials = new ProfileCredentialsProvider()
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", credentials.getCredentials.getAWSAccessKeyId)
hadoopConf.set("fs.s3.awsSecretAccessKey", credentials.getCredentials.getAWSSecretKey)

// Load the previously saved model from S3.
TrainValidationSplitModel.load(s"s3://model_path")

And it works when I run it locally.

However, when I run it on a cluster, I get the following error:

Serialization trace:
fields (org.apache.spark.sql.types.StructType)
at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:101)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:518)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:366)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:307)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:312)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:324)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Caused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructField[]
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructField[].class);
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)
at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:76)
... 10 more

You will probably say: "Easy, you just have to register the class org.apache.spark.sql.types.StructField using kryo.register(SomeClass.class);"

But after registering almost fifteen classes, Kryo asks me to register a class that is private (access is restricted to the spark package).
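
For reference, the registrations looked roughly like this (an abbreviated sketch using SparkConf.registerKryoClasses; the real list had about fifteen entries):

import org.apache.spark.SparkConf
import org.apache.spark.sql.types.{StructField, StructType}

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(
    classOf[StructType],
    classOf[Array[StructField]]  // the array class named in the error above
    // ... and so on, until Kryo asks for a class that is package-private to Spark
  ))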

How can I solve this problem?


Solution

  • The error has nothing to do with saving and loading the model.

    It is caused by spark.kryo.registrationRequired being set to true somewhere in your configuration. When it is enabled, the option behaves as follows:

    Whether to require registration with Kryo. If set to 'true', Kryo will throw an exception if an unregistered class is serialized. If set to false (the default), Kryo will write unregistered class names along with each object. Writing class names can cause significant performance overhead, so enabling this option can enforce strictly that a user has not omitted classes from registration.

    My personal advice is to use it only for diagnostics and to disable it when you actually run the application.
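
    A minimal sketch, assuming the configuration is built through SparkConf (the same property can also be passed with --conf on spark-submit):

    import org.apache.spark.SparkConf

    // Kryo stays enabled as the serializer; only the mandatory-registration
    // check is turned off, so unregistered classes are written with their
    // fully-qualified names instead of throwing the exception above.
    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrationRequired", "false")

    // or, equivalently, on the command line:
    // spark-submit --conf spark.kryo.registrationRequired=false ...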