
How to avoid reloading an ML model every time I call a Python script?


I have two files: file1.py, which contains an ML model about 1 GB in size, and file2.py, which calls the get_vec() method from file1 and receives vectors in return. The ML model is loaded every time file1's get_vec() method is called, and loading the model from disk takes a long time (around 10 s).

I want to tell file1 somehow not to reload the model every time, but to reuse the model loaded by earlier calls.

Sample code is as follows:

# File1.py

import spacy
nlp = spacy.load('model')

def get_vec(post):
    doc = nlp(post)
    return doc.vector

# File2.py

from File1 import get_vec

df['vec'] = df['text'].apply(lambda x: get_vec(x))

So each call takes 10 to 12 seconds. This looks like a small piece of code, but it is part of a large project and I cannot put both in the same file.
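Worth noting: Python caches imports, so a module-level `spacy.load` runs only once per interpreter process; the repeated 10 s cost suggests a new process is started for each call. A minimal sketch of the lazy-singleton pattern that makes this explicit (`load_model` here is a dummy stand-in for the expensive `spacy.load('model')` call):

```python
_model = None  # module-level cache, shared by all callers in this process

def load_model():
    # Stand-in for the expensive spacy.load('model') call.
    return {"name": "model", "loaded": True}

def get_model():
    global _model
    if _model is None:       # only the first call pays the load cost
        _model = load_model()
    return _model

m1 = get_model()
m2 = get_model()
print(m1 is m2)  # the same object is reused across calls
```

As long as File2 keeps running in the same process, every `get_vec` call hits the already-loaded object; the slow path only reappears if the script is relaunched per call.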

Update 1:

I have done some research and learned that I can use Redis to cache the model the first time the script runs, and read the model directly from the cache thereafter. I tried it for testing with Redis as follows:

import spacy
import redis

nlp = spacy.load('en_core_web_lg')
r = redis.Redis(host = 'localhost', port = 6379, db = 0)
r.set('nlp', nlp)

It throws an error:

DataError: Invalid input of type: 'English'. Convert to a bytes, string, int or float first.

It seems type(nlp) is English, and it needs to be converted into a suitable format, so I tried pickle to serialize it. But pickle takes a lot of time encoding and decoding. Is there any way to store this in Redis?
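One alternative worth sketching: rather than caching the 1 GB pipeline object in Redis, cache the small per-text vectors it produces. Pickling a short vector is cheap, unlike pickling the whole English pipeline. A dict stands in for the Redis client below (a real version would swap in `redis.Redis(...)` and use its `get`/`set` the same way), and `fake_get_vec` is a placeholder for the real spaCy call:

```python
import hashlib
import pickle

cache = {}  # stand-in for a Redis client

def fake_get_vec(text):
    # Placeholder for the real doc.vector computed by the spaCy model.
    return [float(len(text)), 1.0, 2.0]

def get_vec_cached(text):
    # Key the cache by a hash of the input text.
    key = "vec:" + hashlib.sha1(text.encode()).hexdigest()
    blob = cache.get(key)
    if blob is not None:
        return pickle.loads(blob)   # cache hit: cheap deserialization
    vec = fake_get_vec(text)
    cache[key] = pickle.dumps(vec)  # cache miss: compute once, store bytes
    return vec
```

This keeps Redis holding only bytes-friendly values, which is exactly what the `DataError` above is complaining about.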

Can anybody suggest how I can make it faster? Thanks.


Solution

  • Here's how to do it:

    Step 1) Create a function in Python and load your model in that function

    from tensorflow.keras.applications import ResNet50

    model = None

    def load_model():
        global model
        # Expensive load happens exactly once, when this function is called.
        model = ResNet50(weights="imagenet")

    If you look carefully, I first assigned the variable model to None, then loaded the model inside the load_model function.

    I also made sure the variable model is global so that it can be accessed from outside the function. The intuition here is that we hold the model object in a global variable, so we can access it anywhere in the code.

    Now that we have our tools ready (i.e. we can access the model from anywhere in this code), let's keep the model resident in your computer's RAM. This is done by:

    import flask
    app = flask.Flask(__name__)

    if __name__ == "__main__":
        print(("* Loading Keras model and Flask starting server..."
               "please wait until server has fully started"))
        load_model()
        app.run()

    Keeping the model in RAM is of no use unless we actually call it, so I expose it through a POST request in Flask:

    @app.route("/predict", methods=["POST"])
    def predict():
        if flask.request.method == "POST":
            data = flask.request.get_json()  # preprocess into model input here
            # whatever you want to do with the loaded model goes here
            output = model.predict(data)
            return flask.jsonify(output.tolist())

    So using this trick you can keep the model in RAM, access it through a global variable, and then use it in your code.
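    The same load-once, serve-many pattern can be sketched with only the standard library, without Flask or Keras: the "model" below is a dummy dict standing in for ResNet50/spaCy, loaded once when the server process starts, then reused for every request.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen, Request

model = {"weights": "imagenet"}  # loaded once, stays in RAM for all requests

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        data = json.loads(self.rfile.read(length))
        # Stand-in for model.predict(data)
        result = {"model": model["weights"], "n_inputs": len(data["inputs"])}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), PredictHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
req = Request(f"http://127.0.0.1:{port}/predict",
              data=json.dumps({"inputs": [1, 2, 3]}).encode(),
              headers={"Content-Type": "application/json"})
resp = json.loads(urlopen(req).read())
server.shutdown()
```

    File2 would then POST its texts to this long-running server instead of importing File1, so the 10 s load cost is paid once at server startup rather than on every call.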