drake

Autodiff for Jacobian derivative with respect to individual joint angles


I am trying to compute $\partial J/\partial q_i$ in Drake C++ for a manipulator, and from my search the best approach seems to be autodiff. I was not able to fully understand the autodiff approach from the resources I found, so I apologize if my approach is not clear enough. I have based my attempt on some previously asked questions on the forum regarding autodiff, as well as https://drake.mit.edu/doxygen_cxx/classdrake_1_1multibody_1_1_multibody_plant.html as a reference.

Since I want to calculate $\partial J/\partial q_i$, the result will be a tensor, i.e. $3 \times 7 \times 7$ (or $6 \times 7 \times 7$ for the spatial Jacobian). I can think of using std::vector<Eigen::MatrixXd> to allocate the output, or alternatively handling one $q_i$ at a time and computing the corresponding Jacobian with autodiff. In either case, I was struggling to pass it in when initializing the Jacobian function. I did the following to initialize autodiff:

std::unique_ptr<multibody::MultibodyPlant<AutoDiffXd>> mplant_autodiff = systems::System<double>::ToAutoDiffXd(mplant);
std::unique_ptr<systems::Context<AutoDiffXd>> mContext_autodiff = mplant_autodiff->CreateDefaultContext();

mContext_autodiff->SetTimeStateAndParametersFrom(*mContext);
const multibody::Frame<AutoDiffXd>* mFrame_EE_autodiff = &mplant_autodiff->GetBodyByName(mEE_link).body_frame();
const multibody::Frame<AutoDiffXd>* mWorld_Frame_autodiff = &(mplant_autodiff->world_frame());

//Initialize the q as autodiff vector
drake::AutoDiffVecXd q_autodiff = drake::math::InitializeAutoDiff(mq_robot);
MatrixX<AutoDiffXd> mJacobian_autodiff(3, mplant_autodiff->num_positions()); // Translational (linear) Jacobian; must be pre-sized to 3 x num_positions for kQDot.

mplant_autodiff->SetPositions(mContext_autodiff.get(), q_autodiff);

mplant_autodiff->CalcJacobianTranslationalVelocity(*mContext_autodiff,
                                      multibody::JacobianWrtVariable::kQDot,
                                      *mFrame_EE_autodiff,
                                      Vector3<AutoDiffXd>::Zero(),  // p_BoBi_B must use the AutoDiffXd scalar type.
                                      *mWorld_Frame_autodiff,
                                      *mWorld_Frame_autodiff,
                                      &mJacobian_autodiff 
                                      );

However, as far as I understand, InitializeAutoDiff initializes the derivatives to the identity matrix, whereas I want $\partial J/\partial q_i$, so is there a better way to do it? In addition, I get error messages when I call the Jacobian function. Is there a way to address this, both for computing $\partial J/\partial q_i$ for each $q_i$ (changing $q_i$ in a for loop) and for getting the result directly as a tensor? My apologies if I am doing something totally tangent to the correct approach. I thank you in anticipation.


Solution

  • However, as far as I understand, InitializeAutoDiff initializes the derivatives to the identity matrix, whereas I want $\partial J/\partial q_i$, so is there a better way to do it?

    That is correct. When you call InitializeAutoDiff and then compute mJacobian_autodiff, you get a matrix of AutoDiffXd. Each AutoDiffXd has a value() function that returns the double value, and a derivatives() function that returns the gradient as an Eigen::VectorXd. We have

    mJacobian_autodiff(i, j).value() = J(i, j)
    mJacobian_autodiff(i, j).derivatives()(k) = ∂J(i, j)/∂q(k)
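
    As a quick sanity check (a minimal sketch, assuming the mJacobian_autodiff computed by your code above), you can recover the plain double Jacobian from the autodiff result with drake::math::ExtractValue:

    // Strip the derivative information: J(i, j) equals mJacobian_autodiff(i, j).value().
    Eigen::MatrixXd J = drake::math::ExtractValue(mJacobian_autodiff);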
    

    So if you want to create a std::vector<Eigen::MatrixXd> such that the k'th entry of this vector stores the matrix ∂J/∂q(k), here is some code:

    std::vector<Eigen::MatrixXd> dJdq(q_autodiff.rows());
    for (int i = 0; i < q_autodiff.rows(); ++i) {
      dJdq[i].resize(mJacobian_autodiff.rows(), mJacobian_autodiff.cols());
    }
    for (int i = 0; i < q_autodiff.rows(); ++i) {
      // dJidq stores the gradient of J.col(i) w.r.t. q, namely dJidq(j, k) = ∂J(j, i)/∂q(k).
      auto dJidq = drake::math::ExtractGradient(mJacobian_autodiff.col(i));
      for (int j = 0; j < static_cast<int>(dJdq.size()); ++j) {
        dJdq[j].col(i) = dJidq.col(j);
      }
    }
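
    As a usage example (not part of the original answer), one common reason to want this tensor is to form the time derivative of the Jacobian by the chain rule, dJ/dt = Σ_k ∂J/∂q(k) * dq(k)/dt. A minimal sketch, assuming a joint-velocity vector qdot with qdot(k) = dq(k)/dt:

    // Chain rule: Jdot = sum_k dJdq[k] * qdot(k), since J depends only on q.
    Eigen::MatrixXd Jdot = Eigen::MatrixXd::Zero(mJacobian_autodiff.rows(),
                                                 mJacobian_autodiff.cols());
    for (int k = 0; k < static_cast<int>(dJdq.size()); ++k) {
      Jdot += dJdq[k] * qdot(k);
    }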
    

    Compute ∂J/∂q(i) for a single i

    If you do not want to compute ∂J/∂q(i) for all i, but only for one specific i, you can change the initialization of q_autodiff from InitializeAutoDiff to this

    AutoDiffVecXd q_autodiff(mq_robot.rows());
    for (int k = 0; k < q_autodiff.rows(); ++k) {
      // Each entry carries a single derivative, taken with respect to q(i).
      q_autodiff(k).value() = mq_robot(k);
      q_autodiff(k).derivatives() = drake::Vector1d::Zero();
      if (k == i) {
        q_autodiff(k).derivatives()(0) = 1;
      }
    }
    

    namely q_autodiff carries the gradient ∂q/∂q(i), whose k'th entry is 0 for k != i and 1 for k == i. Then you can compute mJacobian_autodiff using your current code. Now mJacobian_autodiff(m, n).derivatives()(0) stores ∂J(m, n)/∂q(i) for that specific i. You can extract this gradient as

    Eigen::MatrixXd dJdqi(mJacobian_autodiff.rows(), mJacobian_autodiff.cols());
    for (int m = 0; m < dJdqi.rows(); ++m) {
      for (int n = 0; n < dJdqi.cols(); ++n) {
        // An entry with an empty derivatives() vector is a constant, i.e. its derivative is zero.
        const auto& derivs = mJacobian_autodiff(m, n).derivatives();
        dJdqi(m, n) = derivs.size() == 0 ? 0.0 : derivs(0);
      }
    }
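
    A simple way to sanity-check dJdqi (a sketch, not part of the original answer) is a forward finite difference on the double-typed plant; here mplant, mEE_link, and mq_robot are the double plant, end-effector link name, and joint positions from your question, and eps is an illustrative step size:

    // Finite-difference check: (J(q + eps * e_i) - J(q)) / eps should approximate dJdqi.
    const double eps = 1e-7;
    const auto& frame_EE = mplant.GetBodyByName(mEE_link).body_frame();
    const auto& frame_W = mplant.world_frame();
    auto context_check = mplant.CreateDefaultContext();
    Eigen::MatrixXd J0(3, mplant.num_positions());
    Eigen::MatrixXd J1(3, mplant.num_positions());
    mplant.SetPositions(context_check.get(), mq_robot);
    mplant.CalcJacobianTranslationalVelocity(
        *context_check, multibody::JacobianWrtVariable::kQDot, frame_EE,
        Eigen::Vector3d::Zero(), frame_W, frame_W, &J0);
    Eigen::VectorXd q_perturbed = mq_robot;
    q_perturbed(i) += eps;
    mplant.SetPositions(context_check.get(), q_perturbed);
    mplant.CalcJacobianTranslationalVelocity(
        *context_check, multibody::JacobianWrtVariable::kQDot, frame_EE,
        Eigen::Vector3d::Zero(), frame_W, frame_W, &J1);
    const Eigen::MatrixXd dJdqi_fd = (J1 - J0) / eps;  // Compare against dJdqi.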