I am building an autoencoder with Keras. I want to build a custom loss of the form alpha * L2(x, x_pred) + beta * L1(day_x, day_x_pred). The second term is an L1 loss that penalizes with respect to time (day_x is the day number). The day is the first feature in my input data.

My input data is of the form ['day', 'beta', 'sigma', 'gamma', 'mu'].

The input x is of shape (batch_size, number_of_features) and I have 5 features. So my question is how to extract the first feature from x and x_pred to compute L1(day_x, day_x_pred).
This is my current loss function:

def loss_function(x, x_predicted):
    # with tf.compat.v1.Session() as sess: print(x.eval())
    return 0.7 * K.square(x - x_predicted) + 0.3 * K.abs(x[:, 1] - x_predicted[:, 1])

but this didn't work for me.
This is the loss you need. You have to compute the means of your errors:
from tensorflow.keras import backend as K

def loss_function(x, x_predicted):
    get_day_true = x[:, 0]            # day column of the true input
    get_day_pred = x_predicted[:, 0]  # day column of the reconstruction
    day_loss = K.mean(K.abs(get_day_true - get_day_pred))
    all_loss = K.mean(K.square(x - x_predicted))
    return 0.7 * all_loss + 0.3 * day_loss
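As a quick sanity check (with loss_function defined as above; the numbers here are made up), you can evaluate it eagerly on dummy tensors and confirm it returns a single scalar:

import tensorflow as tf

x_true = tf.constant([[3.0, 0.1, 0.2, 0.3, 0.4],
                      [5.0, 0.2, 0.1, 0.5, 0.6]])  # columns: ['day', 'beta', 'sigma', 'gamma', 'mu']
x_pred = tf.constant([[2.5, 0.1, 0.2, 0.3, 0.4],
                      [5.5, 0.2, 0.1, 0.5, 0.6]])

print(loss_function(x_true, x_pred).numpy())  # one scalar: 0.7 * MSE over all features + 0.3 * MAE over the day column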
Otherwise, if you want to return per-element losses instead of a single scalar, you have to insert an extra dimension so the two terms broadcast correctly:
import tensorflow as tf
from tensorflow.keras import backend as K

def loss_function(x, x_predicted):
    get_day_true = x[:, 0]            # day column of the true input
    get_day_pred = x_predicted[:, 0]  # day column of the reconstruction
    day_loss = K.abs(get_day_true - get_day_pred)  # shape (batch_size,)
    all_loss = K.square(x - x_predicted)           # shape (batch_size, n_features)
    return 0.7 * all_loss + 0.3 * tf.expand_dims(day_loss, axis=-1)
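To see why the extra dimension is needed: all_loss has shape (batch_size, n_features) while day_loss has shape (batch_size,), so expanding day_loss to (batch_size, 1) lets it broadcast across the feature axis. A small shape check (hypothetical batch of 2 samples, 5 features):

import tensorflow as tf

all_loss = tf.zeros((2, 5))   # per-element squared error
day_loss = tf.zeros((2,))     # per-sample day error
combined = 0.7 * all_loss + 0.3 * tf.expand_dims(day_loss, axis=-1)
print(combined.shape)         # (2, 5); Keras then reduces this to a scalar for you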
Use the loss when you compile your model:
model.compile('adam', loss=loss_function)
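For context, a minimal end-to-end sketch (the layer sizes and training data below are placeholders, not from the question):

import numpy as np
from tensorflow.keras import layers, models

n_features = 5  # ['day', 'beta', 'sigma', 'gamma', 'mu']

inputs = layers.Input(shape=(n_features,))
encoded = layers.Dense(3, activation='relu')(inputs)  # toy encoder
decoded = layers.Dense(n_features)(encoded)           # toy decoder
model = models.Model(inputs, decoded)

model.compile('adam', loss=loss_function)

x_train = np.random.rand(64, n_features).astype('float32')  # dummy data
model.fit(x_train, x_train, epochs=2, batch_size=8)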