Search code examples
python, tensorflow, random-forest

KeyError: 'SimpleMLLoadModelFromPathWithHandle' while loading model


I have been trying to load my saved Random Forest model for my Flask application. I followed this tensorflow.org article, but when I load the model I get this error:

FileNotFoundError:

Op type not registered 'SimpleMLLoadModelFromPathWithHandle' in binary running on b5d47309d41b. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed. You may be trying to load on a different device from the computational device. Consider setting the experimental_io_device option in tf.saved_model.LoadOptions to the io_device such as '/job:localhost'.

I used the following code to save:

model.save("/content/DSS_project/my_saved_model")

To load it in another Colab notebook after uploading the saved model, I used the following code:

loaded_model = keras.models.load_model('/content/DSS_project/my_saved_model')
loaded_model.compile(metrics=['accuracy'])

I also ran the following command mentioned in the article above:

!saved_model_cli show --dir "/content/DSS_project/my_saved_model" --all

and got the same error as above.

To replicate the error, here is my Colab code:

colab file

Running all cells shows the full error message.

Thank you!


Solution

  • I found that importing tensorflow_decision_forests fixed the KeyError: 'SimpleMLLoadModelFromPathWithHandle' error.

    Here's an example that worked for me.

    import tensorflow as tf
    import tensorflow_decision_forests  # registers the TF-DF custom ops as an import side effect
    
    model = tf.keras.models.load_model('./saved_models/random_forest_model_1')
    model.summary()
    

    More info can be found in this thread:
    https://discuss.tensorflow.org/t/tensorflow-decision-forests-with-tfx-model-serving-and-evaluation/2137/3

    [This] error can be met when loading a TF-DF SavedModel in side binaries (e.g. the TFX evaluator [depending on the version of TFX] or the C++ TF Serving infrastructure). In this case, you will have to do the op injection:

    • In C++ binaries, the OP can be injected by adding a dependency to //tensorflow_decision_forests/tensorflow/ops/inference:kernel_and_op.
    • In Python binaries, the op can be injected by importing TF-DF (i.e. import tensorflow_decision_forests) before loading the model.
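The reason the import fixes the KeyError is that TF-DF registers its custom ops (including SimpleMLLoadModelFromPathWithHandle) into TensorFlow's global op registry as a side effect of being imported; a SavedModel that references an op not yet in the registry fails to load. Below is a conceptual sketch of that registry pattern in plain Python. It is not TensorFlow's actual internals; the names `OP_REGISTRY`, `register_op`, and `load_graph_op` are illustrative only.

```python
# Conceptual sketch of import-time op registration (not TensorFlow's
# real implementation): a custom-op package populates a global registry
# when its module body runs, and a graph loader looks ops up by name.

OP_REGISTRY = {}

def register_op(name):
    """Decorator that records an op implementation under its name."""
    def wrapper(fn):
        OP_REGISTRY[name] = fn
        return fn
    return wrapper

def load_graph_op(name):
    """Look up an op by name, mimicking the SavedModel loader."""
    try:
        return OP_REGISTRY[name]
    except KeyError:
        raise KeyError(
            f"Op type not registered {name!r} in binary: "
            "import the package that defines it before loading the model"
        )

# Simulates `import tensorflow_decision_forests`: the module body runs
# registration calls, so the op exists before any model is loaded.
@register_op("SimpleMLLoadModelFromPathWithHandle")
def simple_ml_load(path):
    return f"loaded model from {path}"

print(load_graph_op("SimpleMLLoadModelFromPathWithHandle")("/tmp/m"))
# → loaded model from /tmp/m
```

If the `@register_op` line never runs (i.e. the defining package is never imported), `load_graph_op` raises a KeyError naming the missing op, which is exactly the failure mode in the question.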