I have a Python-based face recognition script running several processes (threads?) that each do different things. I am attempting to use one of them to re-train the model once the training images have been changed/updated.
I have tried sending the model through a multiprocessing pipe:
pipe.send(model)
No exception is raised; the call just hangs there indefinitely.
I fear that the model is either unpicklable or simply too big!
multiprocessing uses pickle (or cPickle, depending on the Python version) to serialize objects sent through a pipe. Have you tried checking like this?
>>> import pickle
>>> pik = pickle.dumps(model)
>>> _model = pickle.loads(pik)
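To see what a failure of that round trip typically looks like, here is a minimal sketch. The Model class and its activation lambda are hypothetical stand-ins for whatever the real face recognition model holds; a callable attribute like a lambda is a common reason an object cannot be pickled by the standard library:

```python
import pickle

class Model:
    def __init__(self):
        # Hypothetical attribute: lambdas (and other local functions)
        # cannot be serialized by the stdlib pickle module
        self.activation = lambda x: max(0.0, x)

model = Model()
try:
    pik = pickle.dumps(model)
    picklable = True
except (pickle.PicklingError, AttributeError, TypeError):
    picklable = False

print(picklable)  # False: stdlib pickle cannot serialize the lambda
```

If the check raises like this, the hang you are seeing is consistent with serialization trouble rather than the pipe itself.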
If that succeeds, the model is serializable by pickle. If it fails, you might try using a more powerful serializer, and a fork of multiprocessing that utilizes said better serializer (i.e. dill and pathos.multiprocessing).
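As a lighter-weight alternative to switching serializers (my suggestion, not part of the dill/pathos route): if only one or two specific attributes break pickling, you can sometimes make the object picklable yourself with the stdlib `__getstate__`/`__setstate__` hooks, dropping the offending attribute before pickling and rebuilding it afterwards. The Model class below is a hypothetical sketch:

```python
import pickle

class Model:
    def __init__(self):
        self.weights = [0.1, 0.2, 0.3]
        # Unpicklable attribute (a lambda) that would make pickle.dumps fail
        self.activation = lambda x: max(0.0, x)

    def __getstate__(self):
        # Drop the unpicklable attribute before pickling
        state = self.__dict__.copy()
        del state["activation"]
        return state

    def __setstate__(self, state):
        # Restore the pickled state, then rebuild the dropped attribute
        self.__dict__.update(state)
        self.activation = lambda x: max(0.0, x)

model = Model()
clone = pickle.loads(pickle.dumps(model))
print(clone.weights)          # [0.1, 0.2, 0.3]
print(clone.activation(-1.0)) # 0.0
```

Note that this only helps with the "unpicklable" case; if the model is simply too big, sending it through a pipe will still be slow, and writing it to disk and sending the file path instead may be a better fit.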