Tags: c++, linear-algebra, eigen

Reduced QR decomposition with Eigen


I am trying to solve the classical least-squares problem argmin ‖Ax − b‖² using Eigen. I know that the system is overdetermined (i.e., m >= n, where A is m x n) and that A has full column rank n. I opted for the sparse QR module of Eigen, since A itself is sparse. I got as far as this:

#include <cassert>
#include <iostream>

#include <Eigen/Sparse>
#include <Eigen/SparseQR>

Eigen::VectorXd solve_least_squares(const Eigen::SparseMatrix<double>& matrix,
                                    const Eigen::VectorXd& rhs)
{
  int m = matrix.rows();
  int n = matrix.cols();

  assert(m >= n);
  assert(matrix.isCompressed());

  Eigen::SparseQR<Eigen::SparseMatrix<double>, Eigen::COLAMDOrdering<int>> decomposition(matrix);

  // t = first n entries of Q^T * b
  Eigen::VectorXd qtb = decomposition.matrixQ().transpose() * rhs;
  Eigen::VectorXd t = qtb.topRows(n);

  std::cout << t << std::endl;

  // Solve R x = t using the upper-triangular part of R
  Eigen::VectorXd s = decomposition.matrixR().topRows(n).triangularView<Eigen::Upper>().solve(t);

  // Undo the column permutation
  return decomposition.colsPermutation().inverse() * s;
}

But I am wondering whether this is the most efficient implementation: for one thing, Eigen seems to compute the full QR decomposition rather than a reduced one (Q is m x m rather than m x n). This seems wasteful, since I only need the top n rows of Q^T b.

In the dense case, there is an example in the documentation of the HouseholderQR class:

MatrixXf A(MatrixXf::Random(5,3)), thinQ(MatrixXf::Identity(5,3)), Q;
A.setRandom();
HouseholderQR<MatrixXf> qr(A);
Q = qr.householderQ();
thinQ = qr.householderQ() * thinQ; // <- here
std::cout << "The complete unitary matrix Q is:\n" << Q << "\n\n";
std::cout << "The thin matrix Q is:\n" << thinQ << "\n\n";

Here, a matrix multiplication is used to obtain the reduced Q, which seems even less efficient than slicing the full matrix.

Since the (dense) SVD decomposition of Eigen provides an option to compute a thin SVD, is there a similar option for QR decompositions?


Solution

  • Inside the decomposition, the Q factor is stored as a sequence of Householder reflectors, and the matrixQ() method essentially returns a lightweight reference to that representation (which overloads multiplication so that it behaves "as if" you multiplied by the actual matrix). In this Householder form it makes no difference whether Q represents the full or the thin factor; in fact, multiplying a vector by the full Q (or Q^T) can be done efficiently in place, without ever forming Q explicitly.

    If you want to solve your linear system, you should just write

    Eigen::VectorXd s = decomposition.solve(rhs);
    

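    For illustration, here is a minimal, self-contained sketch of the solve()-based approach (the toy 4x2 system and all names are my own, not from the question):

    ```cpp
    #include <cassert>
    #include <iostream>
    #include <vector>

    #include <Eigen/Sparse>
    #include <Eigen/SparseQR>

    int main()
    {
      // Hypothetical overdetermined 4x2 system (toy data for illustration).
      std::vector<Eigen::Triplet<double>> triplets = {
          {0, 0, 1.0}, {1, 1, 1.0}, {2, 0, 1.0}, {2, 1, 1.0}, {3, 0, 2.0}};
      Eigen::SparseMatrix<double> A(4, 2);
      A.setFromTriplets(triplets.begin(), triplets.end());
      A.makeCompressed(); // SparseQR requires compressed storage

      Eigen::VectorXd b(4);
      b << 1.0, 2.0, 3.0, 2.0;

      Eigen::SparseQR<Eigen::SparseMatrix<double>, Eigen::COLAMDOrdering<int>> qr(A);
      assert(qr.info() == Eigen::Success);

      // solve() applies Q^T, the triangular solve with R, and the column
      // permutation internally -- no need to spell those steps out by hand.
      Eigen::VectorXd x = qr.solve(b);
      std::cout << x.transpose() << std::endl;
      return 0;
    }
    ```

    This system is consistent by construction, so the least-squares solution coincides with the exact solution x = (1, 2).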
    Btw: if you are fine with losing some precision (and your matrix is sufficiently well conditioned), you are in most cases faster solving the normal equations (Matlab notation):

    x = (A'*A) \ (A'*b) 
    

    and using a sparse Cholesky decomposition (e.g., Eigen::SimplicialLLT or Eigen::SimplicialLDLT). But benchmark this with your actual data, and check whether the accuracy is sufficient for your use case (perhaps after a refinement step, which re-uses the Cholesky decomposition).
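
    A sketch of that normal-equations route, including one step of iterative refinement that re-uses the factorization (toy data and all names are my own):

    ```cpp
    #include <iostream>
    #include <vector>

    #include <Eigen/Sparse>
    #include <Eigen/SparseCholesky>

    int main()
    {
      // Same kind of toy overdetermined 4x2 system as in the question.
      std::vector<Eigen::Triplet<double>> triplets = {
          {0, 0, 1.0}, {1, 1, 1.0}, {2, 0, 1.0}, {2, 1, 1.0}, {3, 0, 2.0}};
      Eigen::SparseMatrix<double> A(4, 2);
      A.setFromTriplets(triplets.begin(), triplets.end());

      Eigen::VectorXd b(4);
      b << 1.0, 2.0, 3.0, 2.0;

      // Form the symmetric positive definite normal equations A^T A x = A^T b.
      Eigen::SparseMatrix<double> AtA = A.transpose() * A;
      Eigen::VectorXd Atb = A.transpose() * b;

      Eigen::SimplicialLDLT<Eigen::SparseMatrix<double>> chol(AtA);
      Eigen::VectorXd x = chol.solve(Atb);

      // One refinement step: solve A^T A dx = A^T (b - A x) with the
      // already-computed factorization, then correct x.
      Eigen::VectorXd dx = chol.solve(Eigen::VectorXd(A.transpose() * (b - A * x)));
      x += dx;

      std::cout << x.transpose() << std::endl;
      return 0;
    }
    ```

    Forming A^T A squares the condition number of A, which is where the precision loss mentioned above comes from; the refinement step recovers some of it at the cost of one extra triangular solve.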