
Why does softmax_cross_entropy_with_logits_v2 return a cost even when logits and labels have the same values?


I have tested softmax_cross_entropy_with_logits_v2 with some random numbers:

import tensorflow as tf

x = tf.placeholder(tf.float32,shape=[None,5])
y = tf.placeholder(tf.float32,shape=[None,5])
softmax = tf.nn.softmax_cross_entropy_with_logits_v2(logits=x,labels=y)

with tf.Session() as sess:
    feedx=[[0.1,0.2,0.3,0.4,0.5],[0.,0.,0.,0.,1.]]
    feedy=[[1.,0.,0.,0.,0.],[0.,0.,0.,0.,1.]]
    softmax = sess.run(softmax, feed_dict={x:feedx, y:feedy})
    print("softmax", softmax)

console "softmax [1.8194163 0.9048325]"

What I understood about this function is that it only returns a cost when the logits and labels are different.

Then why does it return 0.9048325 even though the values are the same?


Solution

  • The way tf.nn.softmax_cross_entropy_with_logits_v2 works is that it does softmax on your x array to turn the array into probabilities:

    p_i = e^{x_i} / Σ_j e^{x_j}

    where i is the index into your array. The output of tf.nn.softmax_cross_entropy_with_logits_v2 will then be the dot product between -log(p) and the labels:

    loss = -Σ_i y_i log(p_i)

    Since the labels are either 0 or 1, only the term where the label is equal to one contributes. So in your first sample, the softmax probability of the first index is

    p_0 = e^{0.1} / (e^{0.1} + e^{0.2} + e^{0.3} + e^{0.4} + e^{0.5}) ≈ 0.1621

    and the output will be

    -log(0.1621) ≈ 1.8194

    Your second sample works the same way: softmax([0., 0., 0., 0., 1.]) gives p_4 = e^1 / (4·e^0 + e^1) ≈ 0.4046, so the cost is -log(0.4046) ≈ 0.9048. The cost is nonzero even though the logits happen to equal the labels numerically, because the softmax of the logits is never exactly a one-hot vector. The NumPy sketch below reproduces both values.
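
    As a sanity check, here is a minimal NumPy sketch of the computation described above. softmax_cross_entropy is a hypothetical helper written for illustration, not a TensorFlow API; it reproduces the two values printed by your snippet:

    import numpy as np

    def softmax_cross_entropy(logits, labels):
        logits = np.asarray(logits, dtype=np.float64)
        labels = np.asarray(labels, dtype=np.float64)
        # softmax per row: p_i = e^{x_i} / sum_j e^{x_j}
        shifted = logits - logits.max(axis=1, keepdims=True)  # shift for numerical stability
        p = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
        # cross entropy per row: -sum_i y_i * log(p_i)
        return -(labels * np.log(p)).sum(axis=1)

    feedx = [[0.1, 0.2, 0.3, 0.4, 0.5], [0., 0., 0., 0., 1.]]
    feedy = [[1., 0., 0., 0., 0.], [0., 0., 0., 0., 1.]]
    print(softmax_cross_entropy(feedx, feedy))  # ~[1.8194 0.9048], matching TF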
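
    The cost only approaches zero as the softmax of the logits approaches the one-hot label, which requires much larger logits. For example, reusing the hypothetical helper above:

    print(softmax_cross_entropy([[0., 0., 0., 0., 10.]],
                                [[0., 0., 0., 0., 1.]]))  # ~[0.00018]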