
Adding loss functions in MxNet - "Operator _copyto is non-differentiable because it didn't register FGradient attribute"


I have a system that generates training data, and I want to add individual losses together to make up a batch. I am trying to do the following (full code at the commit in question):

for epoch in range(100):
    with mx.autograd.record():
        loss = 0.0
        for k in range(40):
            (i, x), (j, y) = random.choice(data), random.choice(data)
            # Just compute loss on last output
            if i == j:
                loss = loss - l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
            else:
                loss = loss + l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
        loss.backward()
    trainer.step(BATCH_SIZE)

But I get an error like this:

---------------------------------------------------------------------------
MXNetError                                Traceback (most recent call last)
<ipython-input-39-14981406278a> in <module>()
     21             else:
     22                 loss = loss + l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
---> 23         loss.backward()
     24     trainer.step(BATCH_SIZE)
     25     avg_loss += mx.nd.mean(loss).asscalar()

... More trace ...

MXNetError: [16:52:49] src/pass/gradient.cc:187: Operator _copyto is non-differentiable because it didn't register FGradient attribute.

How do I incrementally add losses together the way I am trying to here?


Solution

  • What version of MXNet are you using? I couldn't reproduce this using the latest code base. You can try either the GitHub master branch or version 0.12; if upgrading is not an option, see the workaround sketch below.
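
You can check which build you are on with print(mx.__version__).

If upgrading is not possible right away, here is a workaround sketch, not a confirmed fix: it assumes the recorded _copyto node comes from calling mx.nd.array(...) inside the record() scope, so it builds the input NDArrays before recording and only records the forward passes. It also accumulates the losses as NDArrays instead of starting from the Python float 0.0. All names (data, net, l2loss, trainer, BATCH_SIZE) are taken from the question's own setup:

import random
import mxnet as mx

for epoch in range(100):
    # Build input NDArrays before recording, so the array copy
    # (_copyto) never enters the autograd graph
    batch = []
    for k in range(40):
        (i, x), (j, y) = random.choice(data), random.choice(data)
        batch.append((i == j, mx.nd.array(x), mx.nd.array(y)))
    with mx.autograd.record():
        losses = []
        for same, x_nd, y_nd in batch:
            l = l2loss(net(x_nd), net(y_nd))
            # Negate same-source pairs instead of subtracting from
            # a Python scalar running total
            losses.append(-l if same else l)
        # add_n sums a list of NDArrays element-wise, keeping the
        # accumulated loss inside the recorded graph
        loss = mx.nd.add_n(*losses)
        loss.backward()
    trainer.step(BATCH_SIZE)

Summing with mx.nd.add_n means the accumulated loss is an NDArray from the start, so no Python scalar is ever mixed into the recorded computation.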