
How to activate GPU computing in Google Colab?


I'm a beginner in Torch and Python.

I was experimenting with some machine learning code that I found online, running it in Google Colab, and I got the following error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-4-d4b0db6cedae> in <module>()
    295                         input_dropout=input_dropout, hidden_dropout1=hidden_dropout1,
    296                         hidden_dropout2= hidden_dropout2, label_smoothing= label_smoothing)
--> 297 experiment.train_and_eval()
    298 
    299 

2 frames
/usr/local/lib/python3.6/dist-packages/torch/cuda/__init__.py in _lazy_init()
    195                 "Cannot re-initialize CUDA in forked subprocess. " + msg)
    196         _check_driver()
--> 197         torch._C._cuda_init()
    198         _cudart = _load_cudart()
    199         _cudart.cudaGetErrorName.restype = ctypes.c_char_p

RuntimeError: cuda runtime error (100) : no CUDA-capable device is detected at /pytorch/aten/src/THC/THCGeneral.cpp:50

I understand that CUDA is for GPU processing. So how can I fix the problem? I was experimenting with the code in this link:


Solution

  • Have you tried the following?

    Go to Runtime > Change runtime type.

    Set the hardware accelerator to GPU, save, and re-run the notebook (a quick verification snippet follows below).

    See also: How to install CUDA in Google Colab GPUs
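
    Once the runtime has a GPU attached, a quick way to confirm that PyTorch can actually see it is to check torch.cuda.is_available() before training. A minimal sketch, assuming a PyTorch setup like the one in the traceback; the model and tensor here are placeholders, not code from the original notebook:

    import torch

    # Confirm the Colab runtime exposes a CUDA device; if this prints False,
    # the runtime type is still set to CPU and the "no CUDA-capable device"
    # error will reappear.
    print(torch.cuda.is_available())
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))

    # Pick the GPU when available, otherwise fall back to CPU so the
    # notebook still runs instead of crashing.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Placeholder model and input: move both to the chosen device
    # before calling train/eval routines.
    model = torch.nn.Linear(10, 2).to(device)
    x = torch.randn(4, 10, device=device)
    out = model(x)
    print(out.device)

    If is_available() still returns False after switching the runtime type, restarting the runtime usually picks up the new hardware setting.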