Tags: python, tensorflow, keras, loss-function

Custom loss with conditional return value


I want a loss function with the following regularization: for each prediction, if the predicted point's norm is lower than 0.9 or greater than 1, apply the regularization.

So I wrote this:

def custom_loss(y_true, y_pred):
    ret = keras.losses.mean_squared_error(y_true, y_pred)
    n = tf.norm(y_pred, axis=1)
    intern_circle_distance = n - 0.9
    return tf.where(tf.logical_and(tf.greater(intern_circle_distance, 0),
                                   tf.less(intern_circle_distance, 0.1)),
                    ret,
                    ret * 2)

When I use this in model.compile, I get this error:

Shapes must be equal rank, but are 0 and 1 for 'loss_71/Hyper_loss/Select' (op: 'Select') with input shapes: [?], [], [].

I tried the loss outside Keras' environment and it seems to work. For example, this:

a = tf.constant([[-1.0, 1.5]])
n = a - 1
K.eval(tf.where(tf.logical_and(tf.greater(n, 0),
                               tf.less(n, 2)),
                a, a * 2))

returns the tensor [[-2., 1.5]].

Why does it work outside the Keras loss function but not inside it? How can I make it work inside the Keras loss function?


Solution

  • keras.losses.mean_squared_error gives you a scalar number: the mean of all the squared errors. If you want to change the error calculation per example, then do something like this:

    def custom_loss(y_true, y_pred):
        diff = tf.squared_difference(y_true, y_pred)
        n = tf.norm(y_pred, axis=1)
        intern_circle_distance = n - 0.9
        diff_reg = tf.where((intern_circle_distance > 0) & (intern_circle_distance < 0.1),
                            diff, 2 * diff)
        return tf.reduce_mean(diff_reg)
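To sanity-check the selection logic without a TensorFlow session, here is a NumPy sketch of the same per-example regularization. The function name and sample values are illustrative, not from the original answer; note the condition is expanded with `[:, None]` so it selects whole rows, mirroring how the rank-1 condition selects rows in TF1's tf.where.

```python
import numpy as np

def custom_loss_np(y_true, y_pred):
    # Per-element squared error, shape (batch, dims)
    diff = (y_true - y_pred) ** 2
    # Per-example norm of the prediction, shape (batch,)
    n = np.linalg.norm(y_pred, axis=1)
    intern_circle_distance = n - 0.9
    # Inside the ring [0.9, 1.0): keep the plain error; outside: double it
    inside = (intern_circle_distance > 0) & (intern_circle_distance < 0.1)
    diff_reg = np.where(inside[:, None], diff, 2 * diff)
    return diff_reg.mean()

y_true = np.array([[1.0, 0.0], [0.0, 0.0]])
y_pred = np.array([[0.95, 0.0],   # norm 0.95 -> in the ring, no penalty
                   [2.00, 0.0]])  # norm 2.0  -> outside, error doubled
print(custom_loss_np(y_true, y_pred))  # mean of [0.0025, 0, 8, 0] = 2.000625
```

The second example's squared error of 4 is doubled to 8, while the first example's 0.0025 is left alone, which is exactly the row-wise selection the Keras version performs.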