Tags: python, flask, keras

ModuleNotFoundError: No module named 'keras.saving.pickle_utils'


When I load the pickle file like this, it fails with an error saying there is no module named keras.saving.pickle_utils. However, the same code runs fine on Google Colab. I don't know what my bug is. Can you help me? I use TensorFlow 2.13.0.

This is the source code for my Flask back-end server:

from flask import Flask, request, jsonify
import numpy as np
import pickle
import joblib


app = Flask(__name__)
file = open('model.pkl', 'rb')
model = joblib.load(file)


@app.route('/')
def index():
    return "Hello world"


@app.route('/predict', methods=['POST'])
def predict():
    N = float(request.form.get('N', False))
    P = float(request.form.get('P', False))
    K = float(request.form.get('K', False))
    temperature = float(request.form.get('temperature', False))
    humidity = float(request.form.get('humidity', False))
    ph = float(request.form.get('ph', False))
    rainfall = float(request.form.get('rainfall', False))
    input_query = np.array([[N, P, K, temperature, humidity, ph, rainfall]])

    result = model.predict(input_query)[0]

    return jsonify({'placement': str(result)})


if __name__ == '__main__':
    app.run(debug=True)

And the traceback is:

Traceback (most recent call last):
  File "C:\Users\Admin\a.py", line 9, in <module>
    model = joblib.load(file)
            ^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\joblib\numpy_pickle.py", line 648, in load
    obj = _unpickle(fobj)
          ^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\joblib\numpy_pickle.py", line 577, in _unpickle
    obj = unpickler.load()
          ^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\pickle.py", line 1213, in load
    dispatch[key[0]](self)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\pickle.py", line 1538, in load_stack_global
    self.append(self.find_class(module, name))
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\pickle.py", line 1580, in find_class
    __import__(module, level=0)
ModuleNotFoundError: No module named 'keras.saving.pickle_utils'

Solution

  • I encountered the exact same issue recently.

    My experience: I trained my model with

    tensorflow.keras.models using Sequential, Dropout, BatchNormalization, etc.

    After getting the desired result, I saved the model to disk with joblib. The model saved fine, but when I tried to deserialize it from my API for prediction, I hit the exact same error:

    ModuleNotFoundError: No module named 'keras.saving.pickle_utils'

    After some research and a read through the Keras documentation, I was able to solve the issue by following its instructions.

    Solutions:

    Thus, if your model was built with tensorflow.keras.models, then according to the documentation you have to save it with the native Keras function:

    model.save('path/to/location.keras')
    

    Note: the file needs to end with the .keras extension, unlike joblib and pickle, which let you give the saved model any extension.

    A Keras model saved this way can be loaded back with the matching Keras function:

    model = keras.models.load_model('path/to/location.keras')
    

    If you have imported the namespace like:

    import tensorflow as tf
    

    Then you can load your model by calling the function this way:

    model = tf.keras.models.load_model('path/to/location.keras')
    

    Further research into the reason for this reveals that a Keras model consists of multiple components:

    • The architecture, or configuration, which specifies what layers the model contains, and how they're connected.
    • A set of weights values (the "state of the model").
    • An optimizer (defined by compiling the model).
    • A set of losses and metrics (defined by compiling the model).

    The Keras API saves all of these pieces together in a unified format, marked by the .keras extension. This is a zip archive consisting of the following:

    1. A JSON-based configuration file (config.json): Records of model, layer, and other trackables' configuration.
    2. An H5-based state file, such as model.weights.h5 (for the whole model), with directory keys for layers and their weights.
    3. A metadata file in JSON, storing things such as the current Keras version.

    You can find a much more comprehensive guide in the Keras serialization documentation.