
"RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation" when there's actually no in-place operations


I am working on replicating a paper, but I have run into some trouble.

The log reports RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation. However, the line the traceback points to is just a simple property setter inside the class:

@pdfvec.setter
def pdfvec(self, value):
    self.param.pdfvec[self.key] = value   # the line the error points to

Aren't in-place operations something like += or *=? I don't see why this error appears on this line.

I am really confused by this message, and I would be glad if anyone could explain why it might happen.

For additional context, this is the part where the setter is called:

def _update_params(params, pdfvecs):
    idx = 0
    for param in params:
        totdim = param.stats.numel()
        shape = param.stats.shape
        param.pdfvec = pdfvecs[idx: idx + totdim].reshape(shape)   # where the setter function was called
        idx += totdim

I know this may still not be enough information to solve the problem, but if you have any idea why this error message appeared, I would be really glad to hear it.


Solution

  • An in-place operation is any operation that modifies a Tensor's underlying storage rather than creating a new Tensor; indexed assignment such as param.pdfvec[self.key] = value counts, not just operators like += or *=. According to your error message, the Tensor being modified has requires_grad set to True.

    That said, indexing param.pdfvec with self.key creates a view of param.pdfvec, which is a leaf Tensor with requires_grad=True. Assigning a value through that view modifies the leaf in place, which would interfere with autograd's ability to compute gradients, so it is prohibited by default. You can still perform the update by suspending autograd (e.g., inside a torch.no_grad() block) or by writing to the underlying storage directly (e.g., via .data), both sketched below.
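    For instance, here is a minimal sketch of the same failure, using a plain leaf Tensor as a stand-in for param.pdfvec:

    import torch

    w = torch.zeros(3, requires_grad=True)  # a leaf Tensor tracked by autograd
    w[0] = 1.0  # indexed assignment writes into w's storage in place
    # RuntimeError: a view of a leaf Variable that requires grad
    # is being used in an in-place operation.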
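    And here is a sketch of the two workarounds applied to the setter from the question (assuming the surrounding class shown above):

    @pdfvec.setter
    def pdfvec(self, value):
        # Option 1: suspend autograd tracking for the duration of the update
        with torch.no_grad():
            self.param.pdfvec[self.key] = value

        # Option 2 (equivalent here): write through .data, bypassing autograd
        # self.param.pdfvec.data[self.key] = value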