Tags: python, pytorch, gpu, google-colaboratory

How to get the total and free GPU memory using PyTorch


I'm using Google Colab's free GPUs for experimentation and wanted to know how much GPU memory is available to play around with. `torch.cuda.memory_allocated()` returns the GPU memory currently occupied by tensors, but how do we determine the total available memory using PyTorch?


Solution

  • In recent versions of PyTorch you can use torch.cuda.mem_get_info:

    https://pytorch.org/docs/stable/generated/torch.cuda.mem_get_info.html#torch.cuda.mem_get_info

    torch.cuda.mem_get_info()
    

    It returns a tuple where the first element is the free memory and the second is the total memory of the GPU, both in bytes.
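    The used amount can be derived from that tuple by subtraction. A minimal sketch, assuming a CUDA-capable machine for the commented-out `torch.cuda` calls (the helper name `format_gib` is just for illustration):

    ```python
    def format_gib(n_bytes: int) -> str:
        """Format a byte count as GiB with two decimal places."""
        return f"{n_bytes / 1024**3:.2f} GiB"

    # On a machine with a CUDA GPU:
    # import torch
    # free, total = torch.cuda.mem_get_info()  # (free_bytes, total_bytes)
    # used = total - free
    # print(f"Free: {format_gib(free)} / Total: {format_gib(total)} "
    #       f"(used: {format_gib(used)})")
    ```

    Note that `mem_get_info` reports device-level memory (via `cudaMemGetInfo`), so "used" includes memory held by other processes, not just your PyTorch tensors.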