
Implication of binary cross entropy loss value in Keras?


During training I noticed that the binary cross entropy loss is unbounded above. Can we interpret anything from the loss value alone? For example, if the binary cross entropy loss is 0.5, does this mean that the model guesses the correct result only half of the time?


Solution

  • The loss you see is the mean of the per-sample losses over the batch (for binary cross entropy, each per-sample loss is -log of the probability the model assigns to the true class, which is why it is unbounded above). With a single sigmoid output and a batch size of 1, your reading is, in my opinion, roughly right. A larger batch size makes this more complicated. One example:

    batch_size = 4
    loss_sample_1 = 0.4  # close to the target
    loss_sample_2 = 0.3  # close
    loss_sample_3 = 0.3  # close
    loss_sample_4 = 1.0  # far from the target
    

    When the average is computed, we get: (0.4 + 0.3 + 0.3 + 1.0) / 4 = 2.0 / 4 = 0.5

    If you read the error that way, you would think that half of the predictions were correct, but in reality 3 out of 4 were correct (assuming each prediction is rounded to 0 or 1).
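
    To make this concrete, below is a minimal sketch using tf.keras that reproduces the numbers above. The predicted probabilities are back-calculated from the assumed per-sample losses (for a true label of 1, binary cross entropy is -log(p), so p = exp(-loss)); the values are illustrative, not taken from an actual model.

    import numpy as np
    import tensorflow as tf

    # True labels: all 1. The predicted probabilities are chosen so the
    # per-sample losses come out to roughly 0.4, 0.3, 0.3 and 1.0
    # (for y = 1, BCE = -log(p), so p = exp(-loss)).
    y_true = np.ones((4, 1), dtype="float32")
    y_pred = np.exp(-np.array([[0.4], [0.3], [0.3], [1.0]], dtype="float32"))

    # Per-sample losses, before any averaging:
    per_sample = tf.keras.losses.binary_crossentropy(y_true, y_pred)
    print(per_sample.numpy())  # ~[0.4, 0.3, 0.3, 1.0]

    # The single number Keras reports is the mean over the batch:
    mean_loss = tf.keras.losses.BinaryCrossentropy()(y_true, y_pred)
    print(mean_loss.numpy())   # ~0.5

    # Accuracy after rounding each probability to 0 or 1:
    accuracy = np.mean(np.round(y_pred) == y_true)
    print(accuracy)            # 0.75 -- 3 of 4 predictions are correct

    So the same mean loss of 0.5 is compatible with 75% accuracy: the reported loss measures how confident the model is in the true class, not directly how often it is right.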