Tags: python, keras, loss-function

Keras backend Custom Loss Function


I am trying to use (tp + tn) / total_samples as my custom loss function. I know how to compute this with lists and list comprehensions, but I don't think there is a way to convert y_true and y_pred to lists.

The code I have written so far is:

from keras import backend as K

def CustomLossFunction(y_true, y_pred):
    # Boolean masks for negative / positive entries of the targets
    y_true_mask_less_zero = K.less(y_true, 0)
    y_true_mask_greater_zero = K.greater(y_true, 0)

    # The same masks for the predictions
    y_pred_mask_less_zero = K.less(y_pred, 0)
    y_pred_mask_greater_zero = K.greater(y_pred, 0)

    # True where prediction and target agree on the negative / positive split
    t_zeros = K.equal(y_pred_mask_less_zero, y_true_mask_less_zero)
    t_ones = K.equal(y_pred_mask_greater_zero, y_true_mask_greater_zero)

Now I need to count the TRUEs in t_zeros and t_ones, add them together, and divide by the total number of samples.

I got an error on this line:

sum_of_true_negatives = K.sum(t_zeros)

Value passed to parameter 'input' has DataType bool not in list of allowed values: float32, float64, int32, uint8, int16

Questions:

  • Is there a built-in loss function for (tp + tn) / total_samples?
  • If not, how can I calculate it using the Keras backend?

Solution

  • You must cast your boolean tensors to float before using them in calculations.

    But a warning, so you don't waste your time:

    This loss function will not work because it is not differentiable. You can't simply discard the "continuity" in y_pred like that. (You will get errors such as "None values not supported" or "An operation has None for gradient".)

    Use one of the existing standard loss functions for classification, such as binary_crossentropy or categorical_crossentropy.
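
    For example, here is a minimal sketch (my assumptions: a binary classifier with a single sigmoid output, labels recoded as 0/1, and a hypothetical input size of 10) that trains on a differentiable loss while still reporting the ratio you want; for 0/1 labels, binary_accuracy is exactly (tp + tn) / total_samples:

    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential([
        Dense(16, activation='relu', input_shape=(10,)),  # hypothetical layer sizes
        Dense(1, activation='sigmoid'),
    ])

    model.compile(optimizer='adam',
                  loss='binary_crossentropy',    # differentiable surrogate loss
                  metrics=['binary_accuracy'])   # reports (tp + tn) / total_samples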

    Casting:

    t_zeros = K.cast(t_zeros, K.floatx())
    t_ones = K.cast(t_ones, K.floatx())
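
    Putting it together, here is a minimal sketch of the full calculation with the casts applied (my assumptions: sign-encoded labels, i.e. negatives < 0 and positives > 0, and a model whose output can take both signs, e.g. a tanh or linear output). Because of the non-differentiability above, use it as a metric rather than as the loss:

    from keras import backend as K

    def tp_tn_ratio(y_true, y_pred):
        # 1.0 where both target and prediction are positive (true positive)
        true_pos = K.cast(K.greater(y_true, 0), K.floatx()) * K.cast(K.greater(y_pred, 0), K.floatx())
        # 1.0 where both target and prediction are negative (true negative)
        true_neg = K.cast(K.less(y_true, 0), K.floatx()) * K.cast(K.less(y_pred, 0), K.floatx())
        # mean of this 0/1 tensor is (tp + tn) / total_samples
        return K.mean(true_pos + true_neg)

    You would then pass it when compiling, e.g. metrics=[tp_tn_ratio], alongside a differentiable loss.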