So, I am trying to write a custom loss function for my Keras model. The loss function needs a global variable that changes after every epoch in order to compute the loss, but I am not able to get the dynamic value: with tf.print() it always prints the same static value. Can anyone point me to a resource or solution for using a global variable inside the loss function that changes after every epoch? Thank you.
I had the same question. I found multiple references that address this issue, although I am still working on fitting them into my own code.
Solution One (recommended): add the custom loss term with model.add_loss. There are several references that cover this approach.
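For example, here is a minimal sketch of that idea, assuming the per-epoch value lives in a non-trainable tf.Variable held by a custom layer (the layer name, shapes, and the penalty formula are only illustrative): the loss term added with add_loss reads the variable's current value on every batch, so whatever you assign to it between epochs is picked up automatically.

import tensorflow as tf
from tensorflow import keras

class ActivityPenalty(keras.layers.Layer):
    # adds an extra loss term weighted by a variable that can change every epoch
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.epoch_weight = tf.Variable(1.0, trainable=False)

    def call(self, inputs):
        # the added loss reads the variable's current value on each batch
        self.add_loss(self.epoch_weight * tf.reduce_mean(tf.square(inputs)))
        return inputs

inputs = keras.Input(shape=(10,))
hidden = keras.layers.Dense(8, activation='relu')(inputs)
penalty = ActivityPenalty()
hidden = penalty(hidden)
outputs = keras.layers.Dense(1)(hidden)
model = keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')

# between epochs (e.g. from a callback): penalty.epoch_weight.assign(new_value)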
Solution Two: subclassing the loss class (which might not be a good fit for your case). For reference:
How to write custom losses by subclassing the tf.keras.losses.Loss class in TensorFlow 2.x
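A short sketch of the subclassing pattern, again keeping the per-epoch value in a non-trainable tf.Variable (the class name and the MSE formula are just placeholders for your own computation):

import tensorflow as tf

class EpochWeightedMSE(tf.keras.losses.Loss):
    def __init__(self, name='epoch_weighted_mse'):
        super().__init__(name=name)
        # non-trainable variable, read on every call; update it from a callback
        self.epoch_weight = tf.Variable(1.0, trainable=False)

    def call(self, y_true, y_pred):
        return self.epoch_weight * tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)

# model.compile(optimizer='adam', loss=EpochWeightedMSE())

The snippet below takes yet another common route: a wrapper (closure) that captures the extra parameters and returns the actual loss function.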
from tensorflow.keras import backend as K

def loss_carrier(extra_param1, extra_param2):
    def loss(y_true, y_pred):
        # x = complicated math involving extra_param1, extra_param2, y_true, y_pred
        # remember to use tensor operations, e.g. K.sum, K.square, K.mean
        # also remember that if extra_param1, extra_param2 are variable tensors instead of simple floats,
        # you need to pass them as inputs=(main, extra_param1, extra_param2) when instantiating your keras.Model
        # and define them as keras.Input (or tf.placeholder in TF1) with the right shape
        x = K.mean(K.square(y_pred - y_true))  # placeholder for the real computation
        return x
    return loss

model.compile(optimizer='adam', loss=loss_carrier(extra_param1, extra_param2))
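To get the behaviour the question asks about (a value that changes every epoch), one option, sketched below under the assumption that extra_param1 is a tf.Variable read inside loss rather than a plain Python float, is to update the variable from a callback. A plain Python number gets baked into the traced graph as a constant, which is likely why tf.print keeps showing the same static value.

import tensorflow as tf

extra_param1 = tf.Variable(0.5, trainable=False)
extra_param2 = 2.0  # a plain float is fine if it never changes

class UpdateExtraParam(tf.keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs=None):
        # replace with whatever per-epoch schedule you need
        extra_param1.assign(0.5 * (epoch + 1))

# model.compile(optimizer='adam', loss=loss_carrier(extra_param1, extra_param2))
# model.fit(x_train, y_train, epochs=10, callbacks=[UpdateExtraParam()])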