I am trying to create a confusion matrix for my test set, which consists of 3585 images. Whenever I try to run the following code:
x_test, y_test = next(iter(dataloader))
y_pred = resnet(x_test)
Google Colab crashes after using up all the available RAM. Does anyone have a workaround for this? Should I do this in batches?
> Should I do this in batches?
Yes! Run the model in batches instead of pushing the whole test set through at once, and reduce the batch size when you create the DataLoader:
dataloader = ...  # recreate the DataLoader with a smaller batch_size here

y_pred = []
resnet.eval()                  # switch off dropout / batch-norm updates for inference
with torch.no_grad():          # skip building the autograd graph; this saves a lot of RAM
    for x_batch, y_batch in dataloader:
        batch_y_pred = resnet(x_batch)
        y_pred.append(batch_y_pred)
y_pred = torch.cat(y_pred)     # combine the per-batch outputs into one tensor
I used a plain list with append here; you could also preallocate a tensor or concatenate as you go, whichever suits you.
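Since the end goal is a confusion matrix, here is a minimal sketch of the whole loop. It assumes your dataloader yields (images, labels) batches and that scikit-learn is available; resnet and dataloader are the names from your code, everything else is illustrative:

import torch
from sklearn.metrics import confusion_matrix

resnet.eval()
all_preds, all_labels = [], []
with torch.no_grad():                    # inference only, no autograd graph
    for x_batch, y_batch in dataloader:
        logits = resnet(x_batch)
        preds = logits.argmax(dim=1)     # predicted class index per image
        all_preds.append(preds.cpu())    # keep accumulated results on the CPU
        all_labels.append(y_batch.cpu())

cm = confusion_matrix(torch.cat(all_labels).numpy(),
                      torch.cat(all_preds).numpy())
print(cm)

Moving each batch's predictions to the CPU before appending means only one batch of activations is held in device memory at a time, so memory use stays flat no matter how large the test set is. If your model is on the GPU, also send each batch there with x_batch.to(device) before the forward pass.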