Tags: python, tensorflow, machine-learning, memory, gpflow

Clear up memory in python loop after creating a model


I am running a for loop in Python in which each iteration creates a model on different data (an extract is shown below). The model created on each iteration is not released from memory, so memory use grows and every iteration gets slower.

import gc
import resource  # needed for the memory report below

for s in range(0, 5):
    X, Y = get_data()
    m = make_dgp_model(X, Y, Z_100, L)
    del m
    gc.collect()
    print('memory: {}'.format(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1000000))

Giving output:

memory: 460.025856
memory: 470.310912
memory: 486.764544
memory: 493.457408
memory: 499.523584

I understand that Python uses reference counting for its memory management and does not overwrite an object in memory just because a variable is reused. For that reason I tried del m followed by an explicit garbage collection, but that doesn't seem to work. Am I doing something wrong? And is there a way I can completely delete what is stored in m once each loop iteration is done?


Solution

  • The first item in GPflow's Tips and Tricks notebook, which is linked from the project's README, answers exactly this question :)

    https://github.com/GPflow/GPflow/blob/develop/doc/source/notebooks/tips_and_tricks.ipynb
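The gist of that tip is that in GPflow 1.x every model adds operations to TensorFlow's global default graph, which del never touches, so the graph itself has to be reset between iterations. The sketch below shows the pattern with the same plain-Python stand-in for the graph, so it runs anywhere; the GPflow call named in the comment is recalled from the 1.x API and should be checked against the notebook:

```python
import gc

_GRAPH = []  # stand-in for TensorFlow's default graph

def make_model(n_ops=1000):
    # Building a model registers its ops in the shared graph.
    ops = [object() for _ in range(n_ops)]
    _GRAPH.extend(ops)
    return ops

sizes = []
for s in range(5):
    m = make_model()
    del m
    # The fix: reset the shared graph each iteration. In GPflow 1.x the
    # analogous call is (assumed API, see the notebook linked above):
    #     gpflow.reset_default_graph_and_session()
    _GRAPH.clear()
    gc.collect()
    sizes.append(len(_GRAPH))

print(sizes)  # [0, 0, 0, 0, 0]
```

With the graph cleared on every pass, nothing accumulates between iterations, which is exactly the behavior missing from the loop in the question.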