
How to train in Keras when there isn't enough memory to load all the training data


I want to train my model in Keras, so I tried to load the images as a numpy array and resize them, but it failed because I don't have enough memory.

I also get a MemoryError when I normalize the images with img/255.

My task is semantic segmentation. I have two folders. One is for the input images and the other is for desired output images. The corresponding images have the same name.

Is there a useful API in Keras for this?


Solution

  • Yes: use a generator and train with the fit_generator method. Inside a generator function you have full control over how the data is loaded and in what quantities, so you can load batches while the model is training and keep only one batch in memory at a time (plus a small queue of prefetched batches that Keras maintains). Note that in recent versions of Keras/TensorFlow, model.fit accepts generators directly and fit_generator is deprecated.
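A minimal sketch of such a generator for the two-folder setup described in the question, where input images and target masks share the same filename. The `load_fn` parameter is a hypothetical loader you would supply (e.g. one built on PIL or keras.preprocessing.image); it is an assumption, not a Keras API. Only one batch is held in memory at a time, and normalization by 255 happens per batch, which avoids the MemoryError from normalizing the whole dataset at once.

```python
import os
import numpy as np

def batch_generator(image_dir, mask_dir, filenames, batch_size, load_fn):
    """Yield (inputs, targets) batches indefinitely.

    image_dir / mask_dir: folders with identically named files.
    filenames: list of shared filenames.
    load_fn: hypothetical function mapping a file path to a numpy array.
    """
    filenames = list(filenames)
    while True:  # Keras generators must loop forever
        np.random.shuffle(filenames)  # reshuffle every epoch
        for start in range(0, len(filenames), batch_size):
            batch = filenames[start:start + batch_size]
            # Load only this batch from disk; nothing else stays in memory.
            x = np.stack([load_fn(os.path.join(image_dir, f)) for f in batch])
            y = np.stack([load_fn(os.path.join(mask_dir, f)) for f in batch])
            yield x.astype("float32") / 255.0, y
```

You would then train with something like `model.fit_generator(gen, steps_per_epoch=len(filenames) // batch_size, epochs=10)` (or pass the generator to `model.fit` in newer Keras versions).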