keras · deep-learning · automatic-differentiation

How to access the partial derivatives of the output with respect to the inputs in a deep learning model?


I want to create my own loss function in Keras that contains derivatives. For example,

from keras import backend as K

def my_loss(x):
    def y_loss(y_true, y_pred):
        # gradient of the model output with respect to the model input x
        res = K.gradients(y_pred, x)
        return res
    return y_loss

is defined, and

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_dim=2, activation='sigmoid'))
model.add(Dense(1, activation='linear'))
model_loss = my_loss(x=model.input)
model.compile(loss=model_loss, optimizer='adam')

Now, because the input is two-dimensional,

K.gradients(y_pred, x)

must be a two-dimensional vector. However, I don't know how to get each scalar component of the gradient. What I ultimately want is all the second derivatives of y_pred with respect to x. Is there a convenient way to get them?


This is similar to this post, but that post separated the two-dimensional variable into two one-dimensional variables. Is there any other way to get the gradients without separating the inputs?
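
What I am essentially after is something like the following sketch (illustration only, assuming the TensorFlow backend: it slices the gradient tensor column by column, and the squared-gradient combination at the end is just a placeholder, not my real objective):

from keras import backend as K

def my_loss(x):
    def y_loss(y_true, y_pred):
        grad = K.gradients(y_pred, x)[0]  # tensor of shape (batch_size, 2)
        dy_dx1 = grad[:, 0]               # derivative w.r.t. the first input
        dy_dx2 = grad[:, 1]               # derivative w.r.t. the second input
        return K.mean(K.square(dy_dx1) + K.square(dy_dx2))  # placeholder combination
    return y_loss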


Solution

  • Unfortunately, Keras does not have a convenient way to get each component of the gradient, so I used TensorFlow to solve this problem.

    If f is the objective function with variable x = (x1, x2),

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=(None, 2))
    f = f(x)  # assume f is already defined as a function of x
    

    then df/dx_1 is

    tf.gradients(f, x)[0][:, 0]
    

    df/dx_2 is

    tf.gradients(f, x)[0][:, 1]
    

    d^2f/dx_1^2 is

    tf.gradients(tf.gradients(f, x)[0][:, 0], x)[0][:, 0]
    

    d^2f/dx_2^2 is

    tf.gradients(tf.gradients(f, x)[0][:, 1], x)[0][:, 1]
    

    d^2f/dx_1dx_2 is

    tf.gradients(tf.gradients(f, x)[0][:, 0], x)[0][:, 1]
    

    I believe there is a better way, but I couldn't find one.
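
    Putting the pieces together, here is a small self-contained sketch (the toy objective f = x1^2 * x2 is assumed purely for illustration; tf.placeholder and tf.Session are TensorFlow 1.x APIs):

    import numpy as np
    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=(None, 2))
    # toy objective, for illustration only: f = x1^2 * x2
    f = tf.square(x[:, 0]) * x[:, 1]

    grad = tf.gradients(f, x)[0]                   # shape (batch, 2): [df/dx1, df/dx2]
    df_dx1 = grad[:, 0]                            # df/dx1 = 2*x1*x2
    d2f_dx1dx2 = tf.gradients(df_dx1, x)[0][:, 1]  # d^2f/dx1dx2 = 2*x1

    with tf.Session() as sess:
        print(sess.run([df_dx1, d2f_dx1dx2],
                       feed_dict={x: np.array([[1.0, 3.0]])}))

    For x1 = 1 and x2 = 3 this evaluates to df/dx1 = 6 and d^2f/dx1dx2 = 2, matching the analytic derivatives 2*x1*x2 and 2*x1.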