My label looks like this:
label = [0, 1, 0, 0, 1, 0]
meaning that classes 1 and 4 are present in the matching input sample.
There is no good reason to create a one-hot encoded version of this. Moreover, if you want to keep the output label size exactly the same (6 in your case), you can't one-hot encode it anyway: one-hot encoding assumes exactly one class is active, while your labels can have several.
For multi-label classification you cannot (more precisely, should not) use softmax as the output activation. Softmax normalizes the outputs to sum to 1, so it is only appropriate when exactly one class can be the true one. In your case it is better to use a sigmoid activation on each output together with binary cross-entropy (i.e. sigmoid cross-entropy), which treats every class as an independent yes/no decision.
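Here is a minimal NumPy sketch of the idea. The logit values are made up for illustration; the numerically stable loss formula below is the standard one (it matches what frameworks like TensorFlow compute in `tf.nn.sigmoid_cross_entropy_with_logits`):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_cross_entropy(logits, labels):
    # Numerically stable per-class binary cross-entropy on raw logits:
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    return np.maximum(logits, 0) - logits * labels + np.log1p(np.exp(-np.abs(logits)))

labels = np.array([0., 1., 0., 0., 1., 0.])            # classes 1 and 4 present
logits = np.array([-2.0, 3.0, -1.5, -2.5, 2.0, -1.0])  # hypothetical raw network outputs

probs = sigmoid(logits)                        # independent per-class probabilities
loss = sigmoid_cross_entropy(logits, labels).mean()

print(probs.round(3))  # each value lies in (0, 1); they do NOT have to sum to 1
print(loss)
```

Because each class gets its own sigmoid, the predicted probabilities for classes 1 and 4 can both be high at the same time, which is exactly what softmax would prevent.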