
tf.nn.softmax_cross_entropy_with_logits how to use labels


For an assignment, I'm supposed to write a single-layer neural network for one part of it. I think I got most of it right, but when I tried using the tf.nn.softmax_cross_entropy_with_logits method, I got an error saying "ValueError: Both labels and logits must be provided." That obviously means I need to provide both labels and logits; my code currently only provides logits, so I understand what is wrong. What I don't understand is: what are labels, and how do I use them in this context? Keep in mind that I'm fairly new and inexperienced with TensorFlow and neural networks in general. Thanks!


Solution

  • In supervised learning you have to provide labels (the ground-truth classes) along with the training data, and softmax_cross_entropy_with_logits computes the softmax cross entropy between the logits and those labels. It measures how far the predicted class probabilities are from the true classes. You can read more about it here: https://www.tensorflow.org/api_docs/python/tf/nn/softmax_cross_entropy_with_logits

    h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)   # dropout on the hidden layer

    W_fc2 = weight_variable([1024, 10])            # output-layer weights
    b_fc2 = bias_variable([10])                    # output-layer biases

    y_conv = tf.matmul(h_fc1_drop, W_fc2) + b_fc2  # logits (unnormalized scores)
    cross_entropy = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y_conv))
    

    I've given you a snippet of code from the TensorFlow tutorials where softmax_cross_entropy_with_logits is used. Here y_ is a placeholder to which the labels (typically one-hot vectors, one row per example) are fed. Also note that softmax_cross_entropy_with_logits is currently deprecated; newer TensorFlow versions recommend tf.nn.softmax_cross_entropy_with_logits_v2 instead.
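
    To make the role of labels concrete, here is a minimal NumPy sketch (not TensorFlow) of what the op computes: labels are one-hot vectors marking each example's true class, and the loss is the cross entropy between them and the softmax of the logits. The array values below are made up for illustration.

    ```python
    import numpy as np

    def softmax(z):
        # subtract the row max for numerical stability
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # Two examples, three classes. Each row of `labels` is a one-hot
    # vector: example 0 belongs to class 2, example 1 to class 0.
    logits = np.array([[1.0, 2.0, 3.0],
                       [2.0, 0.5, 0.1]])
    labels = np.array([[0.0, 0.0, 1.0],
                       [1.0, 0.0, 0.0]])

    # Integer class ids can be turned into one-hot rows, e.g.
    # np.eye(3)[[2, 0]] gives the same `labels` array as above.

    # Per-example softmax cross entropy: -sum(labels * log(softmax(logits)))
    per_example = -np.sum(labels * np.log(softmax(logits)), axis=1)
    loss = per_example.mean()   # analogue of tf.reduce_mean(...)
    print(per_example, loss)
    ```

    In the tutorial snippet, the rows of whatever you feed into the y_ placeholder play the role of `labels` here, and y_conv plays the role of `logits`.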