One of Drake's selling points is the easy availability of gradients via AutoDiff, but I'm struggling to see how to compute second-order derivatives in pydrake.
Given a function f(x), I know of two ways to compute the Jacobian. The first way uses the forwarddiff.jacobian helper function, e.g.:
import numpy as np
from pydrake.forwarddiff import jacobian

def f(x):
    return x.T @ x

x = np.array([1., 2., 3.])
fx = jacobian(f, x)  # = [2., 4., 6.]
The second way uses the autodiffutils bindings more directly:
from pydrake.autodiffutils import InitializeAutoDiff, ExtractGradient

x = InitializeAutoDiff(np.array([1., 2., 3.]))  # column vector of AutoDiffXd
y = f(x)
fx = ExtractGradient(y)  # = [[2., 4., 6.]]
Are there similar ways to get the Hessian? Nested calls to the jacobian helper function don't work, since the second argument can't be an AutoDiffXd type. But perhaps there is some way more analogous to the second method above?
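For concreteness, the failing nested call mentioned above looks roughly like this (a hypothetical sketch of the attempt, not working code):

# Hypothetical nested attempt: the outer jacobian hands an array of
# AutoDiffXd into the inner call's second argument, which raises.
fxx = jacobian(lambda v: jacobian(f, v), x)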
The current recommended answer is to use symbolic::Expression instead of AutoDiffXd when you need more than one derivative. While all of our C++ code should work if it were compiled with AutoDiffScalar<AutoDiffXd> to provide second derivatives, we currently don't build that as one of our default scalar types in libdrake.so.
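For example, a Hessian can be obtained through the pydrake.symbolic bindings roughly as follows. This is a minimal sketch, not an official recipe; it assumes the symbolic module's MakeVectorVariable, Jacobian, and Evaluate helpers and reuses f(x) = x.T @ x from the question:

from pydrake.symbolic import Evaluate, Jacobian, MakeVectorVariable

# Build a symbolic version of f(x) = x.T @ x.
x = MakeVectorVariable(3, "x")
y = x.dot(x)

# Differentiate twice: the gradient, then the Jacobian of the gradient.
grad = y.Jacobian(x)      # vector of Expressions, dy/dx
hess = Jacobian(grad, x)  # 3x3 matrix of Expressions, d2y/dx2

# Substitute numeric values to recover a numeric Hessian.
env = dict(zip(x, [1., 2., 3.]))
H = Evaluate(hess, env)   # = 2 * identity for this f

Since the Hessian here is built once as a matrix of Expressions, it can be re-evaluated at new points by changing only the environment, without re-deriving anything.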