Tags: python, numpy, derivative

Compute the Jacobian matrix in Python


import numpy as np


a = np.array([[1,2,3],
              [4,5,6],
              [7,8,9]])


b = np.array([[1,2,3]]).T

c = a.dot(b)  # the function: c(b) = a @ b

jacobian = a  # the partial derivative of c with respect to b is a

I am reading about the Jacobian matrix and trying to build one. From what I have read so far, the Python code above should give the Jacobian. Am I understanding this right?


Solution

  • You can use the Harvard autograd library (link), where grad and jacobian take a function as their argument:

    import autograd.numpy as np
    from autograd import grad, jacobian
    
    x = np.array([5,3], dtype=float)
    
    def cost(x):
        return x[0]**2 / x[1] - np.log(x[1])
    
    gradient_cost = grad(cost)
    jacobian_cost = jacobian(cost)
    
    gradient_cost(x)                  # gradient of the scalar-valued cost at x
    jacobian_cost(np.array([x,x,x]))  # Jacobian evaluated on a stacked (3, 2) input
    

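    As a quick sanity check on the question's claim that the Jacobian of c = a.dot(b) is a, here is a minimal sketch using the same jacobian function (the array values are taken from the question):

    import autograd.numpy as np
    from autograd import jacobian

    a = np.array([[1., 2., 3.],
                  [4., 5., 6.],
                  [7., 8., 9.]])

    def f(b):
        return a.dot(b)  # the linear map c(b) = a @ b

    jacobian(f)(np.array([1., 2., 3.]))  # returns a: the Jacobian of a linear map is its matrix
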
    Otherwise, you could use the jacobian method available for matrices in sympy:

    from sympy import sin, cos, Matrix
    from sympy.abc import rho, phi
    
    X = Matrix([rho*cos(phi), rho*sin(phi), rho**2])
    Y = Matrix([rho, phi])
    
    X.jacobian(Y)
    
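    To evaluate the resulting symbolic Jacobian at a particular point, you can substitute numeric values (the values rho = 1, phi = 0 below are just an example):

    X.jacobian(Y).subs({rho: 1, phi: 0})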

    You may also be interested in this low-level variant (link). MATLAB provides nice documentation on its jacobian function here.

    UPDATE: Note that the autograd library has since been superseded by jax, which provides jacfwd and jacrev for computing Jacobians via forward- and reverse-mode differentiation (link).
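
    For reference, here is a minimal sketch of the equivalent computation in jax, assuming it is installed (the cost function is the one from the autograd example above):

    import jax.numpy as jnp
    from jax import jacfwd, jacrev

    def cost(x):
        return x[0]**2 / x[1] - jnp.log(x[1])

    x = jnp.array([5.0, 3.0])
    jacfwd(cost)(x)  # forward-mode Jacobian (a plain gradient here, since cost is scalar-valued)
    jacrev(cost)(x)  # reverse-mode Jacobian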