I want to create a custom loss function in Torch which is a modification of ClassNLLCriterion. Concretely, ClassNLLCriterion loss is:
loss(x, class) = -x[class]
I want to modify this to be:
loss(x, class) = -x[class]*K
where K is a function of the network input, NOT the network weights or the network output. Thus K can be treated as a constant during backpropagation.
What is the easiest way of implementing this custom criterion? The updateOutput() function seems straightforward, but how do I modify the updateGradInput() function?
Basically your loss function L is a function of the input and the target, so you have
loss(input, target) = ClassNLLCriterion(input, target) * K
if I understand your new loss correctly. Then you want to implement updateGradInput, which returns the derivative of your loss function with respect to the input. By the product rule, that is
updateGradInput[ClassNLLCriterion](input, target) * K + ClassNLLCriterion(input, target) * dK/dinput
In general you would have to compute the derivative of K with respect to the input of the loss function (you did not give us the formula to compute K) and plug it into the line above. But in your case K depends only on the network input, not on the criterion's input x, so dK/dinput = 0 and the gradient simplifies to updateGradInput[ClassNLLCriterion](input, target) * K. Since your new loss function wraps ClassNLLCriterion, you can reuse its updateOutput and updateGradInput to compute yours.
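A minimal sketch of such a wrapper criterion, assuming K is computed elsewhere from the network input and set on the criterion before each forward/backward call (the class name WeightedClassNLLCriterion and the field self.K are illustrative, not part of the nn package):

```lua
require 'nn'

-- Criterion computing loss(x, class) = ClassNLLCriterion(x, class) * K,
-- where K is treated as a constant with respect to x.
local WeightedNLL, parent = torch.class('nn.WeightedClassNLLCriterion', 'nn.Criterion')

function WeightedNLL:__init()
   parent.__init(self)
   self.nll = nn.ClassNLLCriterion()
   self.K = 1  -- set this from the network input before forward/backward
end

function WeightedNLL:updateOutput(input, target)
   -- scale the wrapped criterion's loss by the constant K
   self.output = self.nll:updateOutput(input, target) * self.K
   return self.output
end

function WeightedNLL:updateGradInput(input, target)
   -- dK/dinput = 0, so the gradient is just K times the wrapped gradient
   self.gradInput = self.nll:updateGradInput(input, target):clone():mul(self.K)
   return self.gradInput
end
```

Usage would then look like `crit.K = computeK(netInput)` (computeK being whatever function you use to derive K from the network input), followed by the usual `crit:forward(netOutput, target)` and `crit:backward(netOutput, target)`.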