Tags: python, tensorflow, neural-network, conv-neural-network, tensorflow-estimator

Why does softmax cross entropy loss never give a value of zero in TensorFlow?


I'm building a neural network in TensorFlow and using softmax_cross_entropy to compute the loss. While testing, I noticed that it never gives a value of zero, even when I compare identical values. This is my code:

import tensorflow as tf

labels = [1, 0, 1, 1]

with tf.Session() as sess:
    onehot_labels = tf.one_hot(indices=labels, depth=2)
    logits = [[0., 1.], [1., 0.], [0., 1.], [0., 1.]]
    print(sess.run(onehot_labels))
    loss = tf.losses.softmax_cross_entropy(onehot_labels=onehot_labels, logits=logits)
    print(sess.run(loss))

This is what I obtain:

[[0. 1.]
 [1. 0.]
 [0. 1.]
 [0. 1.]]
0.31326166

Why is it not zero?


Solution

  • Matias's post is correct. The following code gives the same result as your code:

    import tensorflow as tf

    labels = [1, 0, 1, 1]

    with tf.Session() as sess:
        onehot_labels = tf.one_hot(indices=labels, depth=2)
        logits = [[0., 1.], [1., 0.], [0., 1.], [0., 1.]]
        print(sess.run(onehot_labels))

        # softmax turns the logits into probabilities that sum to 1 per row
        probabilities = tf.nn.softmax(logits=logits)
        # cross entropy, averaged over the batch
        # (same reduction as tf.losses.softmax_cross_entropy with default weights)
        loss = -tf.reduce_sum(onehot_labels * tf.log(probabilities)) / len(labels)

        print(sess.run(loss))  # 0.31326166
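
  • As for why it is not zero: softmax of finite logits always assigns a strictly positive probability to every class, so the probability of the correct class is strictly less than 1 and its negative log is strictly greater than 0. Every row of your logits has the same margin of 1, so each row contributes -log(0.7311) ≈ 0.31326, and the batch mean is 0.31326166. The sketch below (plain Python using only the math module; the variable names are illustrative, not from the original code) reproduces that number by hand and shows the loss only approaching zero as the margin grows:

    import math

    # one row of the logits above; the correct class is index 1
    logits_row = [0., 1.]
    exps = [math.exp(z) for z in logits_row]
    probs = [e / sum(exps) for e in exps]   # softmax -> [0.2689..., 0.7310...]
    print(-math.log(probs[1]))              # 0.31326..., same as the TensorFlow result

    # a larger margin drives the loss toward zero, but never exactly to zero,
    # because softmax of finite logits is never exactly one-hot
    big_row = [0., 10.]
    exps = [math.exp(z) for z in big_row]
    probs = [e / sum(exps) for e in exps]
    print(-math.log(probs[1]))              # ~4.5e-05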