
PyTorch gradient calculation of one Tensor


I'm a beginner in PyTorch and I'm probably stuck on a relatively trivial problem, but I can't figure it out at the moment.

When calculating the gradient of a tensor I get a constant gradient of 1. Shouldn't the gradient of a constant be 0, though?

Here is a minimal example:

import torch
x = torch.tensor(50., requires_grad=True)

y = x

y.backward()

print(x.grad)
# Output: tensor(1.)

So why is the output 1 and not 0?


Solution

  • You are not computing the gradient of a constant, but the gradient of the variable x, which happens to hold the constant value 50. Since y = x, the derivative of y with respect to x is 1, no matter what value is stored in x.
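
    As a quick check (a minimal sketch along the lines of the question's example), you can compare y = x, whose derivative with respect to x is 1, with a function that is genuinely constant in x, whose derivative is 0:

    import torch

    x = torch.tensor(50., requires_grad=True)

    # y = x: dy/dx = 1, regardless of the value stored in x.
    y = x
    y.backward()
    print(x.grad)  # tensor(1.)

    # A function that is actually constant in x does give gradient 0.
    x.grad = None  # clear the accumulated gradient
    y = 0 * x      # y == 0 for every x, so dy/dx = 0
    y.backward()
    print(x.grad)  # tensor(0.)

    The value 50 only determines where the function is evaluated, not what the function is; the gradient reflects how y changes as x changes.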