Tags: deep-learning, pytorch, gpu, google-colaboratory

Will the GPU still be used for training if I don't transfer the tensors and model to the GPU using to(device)?


I am using Google Colab, and I need to know whether it will use the GPU for training if I don't call model.to('cuda') and data.to('cuda').


Solution

  • If you do not call model.to(torch.device('cuda')) and data.to(torch.device('cuda')), your model and all of your tensors remain on the default device, which is the CPU. They are unaware of the GPU, and PyTorch performs all computation on the CPU. You can verify this yourself, as shown in the sketch below.

    You can see this link for more information about torch.device.
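
A minimal sketch illustrating the point (the Linear layer and random batch are just placeholders): tensors and parameters report which device they live on, and they only move to the GPU after an explicit .to(device) call.

```python
import torch
import torch.nn as nn

# Use the GPU if one is available (e.g. a Colab GPU runtime), otherwise the CPU.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = nn.Linear(10, 2)    # example model; created on the CPU by default
data = torch.randn(4, 10)   # example batch; also created on the CPU

print(next(model.parameters()).device)  # cpu
print(data.device)                      # cpu

# Only after moving them explicitly does training run on the GPU.
model = model.to(device)
data = data.to(device)

print(next(model.parameters()).device)  # cuda:0 if a GPU is available
print(data.device)                      # cuda:0 if a GPU is available
```

Note that both the model and the data must be on the same device; mixing a CPU tensor with a CUDA model raises a runtime error.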