The following two nested loops keep consuming memory until I run out, but I can't figure out why. I am deleting all of the variables created in each iteration at the end of that iteration, and it still leaks.
!pip3 install cupy-cuda101

import cupy as cp
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

xtrain = cp.asnumpy(cp.random.uniform(-1, 1, size=(150000, 50)))

for i in range(0, 1000):
    weights = cp.random.uniform(-1, 1, size=(1275, 1000))
    for chunk in range(0, xtrain.shape[0], 5000):
        xchunk = xtrain[chunk:chunk + 5000, :]
        poly = PolynomialFeatures(interaction_only=True, include_bias=False)
        xchunk = cp.array(poly.fit_transform(xchunk))
        ranks = cp.matmul(xchunk, weights)
        del ranks, xchunk, poly
    del weights
xtrain is just float data, with values between -1 and 1.
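One way to see how much device memory is actually being held between iterations is to print the default memory pool's statistics each pass. A minimal sketch (the chunked work is elided, and running only a few iterations is enough for the diagnostic):

mempool = cp.get_default_memory_pool()

for i in range(0, 10):
    weights = cp.random.uniform(-1, 1, size=(1275, 1000))
    # ... chunked PolynomialFeatures / matmul work as above ...
    del weights
    # used_bytes(): memory backing arrays that are still alive.
    # total_bytes(): everything the pool has reserved from the GPU,
    # including blocks that were freed with `del` but are still cached.
    print(i, mempool.used_bytes(), mempool.total_bytes())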
Inserting these lines at the end of each iteration fixed it:
cp.get_default_memory_pool().free_all_blocks()
cp.get_default_pinned_memory_pool().free_all_blocks()
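For reference, a sketch of the original loop with those two calls added at the end of each outer iteration (assuming the same setup as in the question). `del` only returns blocks to CuPy's caching memory pool; `free_all_blocks()` is what hands the cached, unused blocks back to the device and pinned-host allocators:

for i in range(0, 1000):
    weights = cp.random.uniform(-1, 1, size=(1275, 1000))
    for chunk in range(0, xtrain.shape[0], 5000):
        xchunk = xtrain[chunk:chunk + 5000, :]
        poly = PolynomialFeatures(interaction_only=True, include_bias=False)
        xchunk = cp.array(poly.fit_transform(xchunk))
        ranks = cp.matmul(xchunk, weights)
        del ranks, xchunk, poly
    del weights
    # Release the blocks that `del` left cached in CuPy's pools.
    cp.get_default_memory_pool().free_all_blocks()
    cp.get_default_pinned_memory_pool().free_all_blocks()

Calling them once per outer iteration keeps the inner chunk loop reusing cached blocks; putting them inside the chunk loop would also work but forces a fresh allocation for every chunk.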