tensorflow, machine-learning, loss-function, multiclass-classification

Tensorflow loss calculation for multiple positive classifications


My label looks like this:

label = [0, 1, 0, 0, 1, 0]

Meaning that classes 1 and 4 are present in the matching input sample.

  1. How do I create one-hot encoded labels for a label like that?
  2. Which loss function is more appropriate for a case like this (sigmoid cross entropy, softmax cross entropy, or sparse softmax cross entropy)?

Solution

    1. There is no good reason to one-hot encode this label: it is already a multi-hot (binary) vector. Since more than one class can be active at the same time, a single one-hot vector cannot represent it, and if you want the label to stay exactly length 6, the multi-hot form you already have is the right one.

    2. For multi-label classification you cannot (or rather, should not) use softmax as the output activation. Softmax assumes exactly one output is the true class, since it normalizes the outputs to sum to 1. In your case, sigmoid cross-entropy is the better choice: it treats each of the 6 outputs as an independent binary prediction, so any number of classes can be active at once.
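
Both points can be sketched together. The example below builds the multi-hot label from the active class indices and computes per-class sigmoid cross-entropy using the same numerically stable formula that `tf.nn.sigmoid_cross_entropy_with_logits` documents; NumPy is used here only to keep the sketch self-contained, and the logit values are made up for illustration.

```python
import numpy as np

NUM_CLASSES = 6

# Point 1: build the multi-hot label directly from the active class indices.
label = np.zeros(NUM_CLASSES)
label[[1, 4]] = 1.0          # classes 1 and 4 are present
# label is now [0, 1, 0, 0, 1, 0] -- no one-hot encoding needed.

# Hypothetical raw model outputs (logits), one per class.
logits = np.array([-2.0, 3.0, -1.5, -0.5, 2.0, -3.0])

# Point 2: numerically stable sigmoid cross-entropy, the formula documented
# for tf.nn.sigmoid_cross_entropy_with_logits:
#   max(x, 0) - x * z + log(1 + exp(-|x|))
loss_per_class = (np.maximum(logits, 0)
                  - logits * label
                  + np.log1p(np.exp(-np.abs(logits))))

# Each class contributes an independent binary cross-entropy term,
# so the loss does not force the classes to compete as softmax would.
loss = loss_per_class.mean()
```

In TensorFlow itself this corresponds to `tf.nn.sigmoid_cross_entropy_with_logits(labels=label, logits=logits)`, or `tf.keras.losses.BinaryCrossentropy(from_logits=True)` at the Keras level.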