tensorflow, machine-learning, deep-learning, backpropagation

Will a zero loss affect the backpropagation update?


Say I'm doing a standard DNN classification task, and I'm using the cross-entropy loss. After computing the per-example loss, I apply a mask vector ([0, 0, 0, 1, 1, ...]) to set some of the loss values to zero.
The question is: how will TensorFlow handle these zero loss values? Will they be involved in backpropagation or not?


Solution

  • Yes, TensorFlow will handle this correctly. The gradients flowing back from the masked loss values will simply be 0, because those examples no longer contribute to the total loss; see the sketch below.
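
A minimal sketch illustrating this, assuming a toy setup (the variable names `logits`, `labels`, and `mask` are illustrative, not from the question):

```python
import tensorflow as tf

# Hypothetical toy setup: 4 examples, 3 classes.
logits = tf.Variable([[2.0, 0.5, 0.1],
                      [0.3, 1.7, 0.2],
                      [0.1, 0.2, 2.5],
                      [1.0, 1.0, 1.0]])
labels = tf.constant([0, 1, 2, 0])

# Mask that zeroes out the loss of the first two examples.
mask = tf.constant([0.0, 0.0, 1.0, 1.0])

with tf.GradientTape() as tape:
    per_example_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)
    masked_loss = tf.reduce_sum(per_example_loss * mask)

grads = tape.gradient(masked_loss, logits)
print(grads)
# The gradient rows for examples 0 and 1 (the masked ones) are all zeros;
# only the unmasked examples contribute to the backpropagated gradient.
```

Because the masked examples are multiplied by 0 before the reduction, autodiff propagates a zero gradient through them, so they have no effect on the parameter update.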