I would like to use MDA (mean directional accuracy) as a custom loss function for a TensorFlow neural network.
I am trying to implement it as described here: Custom Mean Directional Accuracy loss function in Keras
```python
import tensorflow.keras.backend as K

def mda(y_true, y_pred):
    s = K.equal(K.sign(y_true[1:] - y_true[:-1]),
                K.sign(y_pred[1:] - y_pred[:-1]))
    return K.mean(K.cast(s, K.floatx()))
```
The network works fine, but when I try to fit my data I get this error:

ValueError: No gradients provided for any variable

I think this is because I am losing the gradient information from my prediction tensor, but I don't know how to implement this, or whether it makes sense at all. Ultimately, what I want to predict is whether a numeric series is going up or down, which is why this function made sense to me.
The problem is that K.equal and K.cast turn your numbers into booleans. As a result, no gradient can be calculated through them.
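You can see the broken gradient directly by wrapping your original loss in a `tf.GradientTape` (the toy data below is made up purely for illustration):

```python
import tensorflow as tf
import tensorflow.keras.backend as K

# Hypothetical toy data; y_pred is a Variable so we can ask for its gradient.
y_true = tf.constant([1.0, 2.0, 1.5, 3.0])
y_pred = tf.Variable([0.5, 1.0, 2.0, 2.5])

with tf.GradientTape() as tape:
    s = K.equal(K.sign(y_true[1:] - y_true[:-1]),
                K.sign(y_pred[1:] - y_pred[:-1]))
    loss = K.mean(K.cast(s, K.floatx()))

# K.equal produces booleans, so there is no differentiable path
# from loss back to y_pred: the gradient comes back as None.
g = tape.gradient(loss, y_pred)
print(g)
```

This `None` is exactly what triggers the "No gradients provided for any variable" error during `fit`.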
You could replace them with an arithmetic calculation, using the fact that when two numbers are equal their difference is zero, and that since sign returns only -1, 0 or 1, the absolute difference of two signs can only be 0, 1 or 2:
```python
def mda(y_true, y_pred):
    d = K.abs(K.sign(y_true[1:] - y_true[:-1]) -
              K.sign(y_pred[1:] - y_pred[:-1]))
    s = (1. - d) * (d - 1.) * (d - 2.) / 2.
    return K.mean(s)
```
s is equal to 1 when your K.equal would be true, and 0 otherwise.
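To convince yourself that the polynomial really reproduces the equality test, you can evaluate it for the three possible values of d (a quick NumPy check, not part of the original answer):

```python
import numpy as np

# d is the absolute difference of two signs, so it can only be 0, 1 or 2.
# The cubic maps 0 -> 1 (directions agree) and both 1 and 2 -> 0 (they differ).
d = np.array([0.0, 1.0, 2.0])
s = (1. - d) * (d - 1.) * (d - 2.) / 2.
assert np.allclose(s, [1.0, 0.0, 0.0])
```

Because every operation in the new loss (subtraction, sign, abs, polynomial, mean) is a regular tensor op, TensorFlow can now build a gradient path, even though sign itself has a zero gradient almost everywhere, so training proceeds without the error.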