I want to solve a least-squares-like system A^t * A * x = -A^t * b, where b is the current residual vector. (I'm implementing the Gauss-Newton method for a special problem.)
I wrote special routines that allow me to compute the products A * x and A^t * y. With such routines, it's easy to use matrix-free solvers, thanks to Eigen.
But my approach doesn't converge as well as Eigen::LeastSquaresConjugateGradient. I made a small test, and it looks like LeastSquareDiagonalPreconditioner speeds up convergence a lot.
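For reference, a minimal sketch of the kind of test I mean, on a toy random system (the sizes and matrix are made up); LeastSquaresConjugateGradient uses LeastSquareDiagonalPreconditioner by default:

    #include <iostream>
    #include <Eigen/Dense>
    #include <Eigen/Sparse>
    #include <Eigen/IterativeLinearSolvers>

    int main()
    {
      // Toy overdetermined system: 100 residuals, 40 unknowns (sizes made up).
      const int m = 100, n = 40;
      Eigen::MatrixXd Ad = Eigen::MatrixXd::Random(m, n);
      Eigen::SparseMatrix<double> A = Ad.sparseView();
      Eigen::VectorXd b = Eigen::VectorXd::Random(m);

      // LSCG runs CG on the normal equations and, by default, preconditions
      // with LeastSquareDiagonalPreconditioner, i.e. 1 / ||col_j(A)||^2.
      Eigen::LeastSquaresConjugateGradient<Eigen::SparseMatrix<double> > lscg;
      lscg.compute(A);
      Eigen::VectorXd x = lscg.solve(-b);  // solves A^t * A * x = -A^t * b
      std::cout << "#iterations: " << lscg.iterations()
                << "  estimated error: " << lscg.error() << std::endl;
      return 0;
    }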
My question is: how can I use LeastSquareDiagonalPreconditioner, or implement my own preconditioner, if I can only compute matrix products? (I don't understand the preconditioning/conjugate gradient machinery very well, by the way.)
EDIT
For clarity: I want to use the matrix-free solvers from Eigen with my product routines.
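Here is a condensed sketch of what I mean, following Eigen's matrix-free solver example (https://eigen.tuxfamily.org/dox/group__MatrixfreeSolverExample.html): a square wrapper for the symmetric operator B = A^t * A whose product is evaluated with the two routines, solved with plain ConjugateGradient. The names NormalEqOp, my_Ax and my_ATy are made up, and the two routines are replaced by toy stand-ins so the sketch compiles:

    #include <iostream>
    #include <Eigen/Dense>
    #include <Eigen/Sparse>
    #include <Eigen/IterativeLinearSolvers>

    // Toy stand-ins for the real autodiff-based product routines (here they
    // just wrap an explicit random A so the sketch is runnable):
    static Eigen::MatrixXd g_A = Eigen::MatrixXd::Random(100, 40);
    Eigen::VectorXd my_Ax (const Eigen::VectorXd& x) { return g_A * x; }             // A * x
    Eigen::VectorXd my_ATy(const Eigen::VectorXd& y) { return g_A.transpose() * y; } // A^t * y

    class NormalEqOp;  // matrix-free stand-in for B = A^t * A (n x n, symmetric)

    namespace Eigen { namespace internal {
      // Let NormalEqOp inherit the traits of a SparseMatrix:
      template<> struct traits<NormalEqOp>
        : public Eigen::internal::traits<Eigen::SparseMatrix<double> > {};
    } }

    class NormalEqOp : public Eigen::EigenBase<NormalEqOp> {
    public:
      typedef double Scalar;
      typedef double RealScalar;
      typedef int StorageIndex;
      enum { ColsAtCompileTime = Eigen::Dynamic,
             MaxColsAtCompileTime = Eigen::Dynamic,
             IsRowMajor = false };

      explicit NormalEqOp(Index n) : m_n(n) {}
      Index rows() const { return m_n; }
      Index cols() const { return m_n; }

      template<typename Rhs>
      Eigen::Product<NormalEqOp, Rhs, Eigen::AliasFreeProduct>
      operator*(const Eigen::MatrixBase<Rhs>& x) const {
        return Eigen::Product<NormalEqOp, Rhs, Eigen::AliasFreeProduct>(*this, x.derived());
      }

    private:
      Index m_n;
    };

    namespace Eigen { namespace internal {
      // Implements dst += alpha * (A^t * A) * rhs via the two product routines:
      template<typename Rhs>
      struct generic_product_impl<NormalEqOp, Rhs, SparseShape, DenseShape, GemvProduct>
        : generic_product_impl_base<NormalEqOp, Rhs, generic_product_impl<NormalEqOp, Rhs> >
      {
        typedef typename Product<NormalEqOp, Rhs>::Scalar Scalar;

        template<typename Dest>
        static void scaleAndAddTo(Dest& dst, const NormalEqOp& /*lhs*/,
                                  const Rhs& rhs, const Scalar& alpha)
        {
          dst += alpha * my_ATy(my_Ax(rhs));
        }
      };
    } }

    int main()
    {
      const int n = 40;
      Eigen::VectorXd b = Eigen::VectorXd::Random(100);  // current residual

      NormalEqOp B(n);
      Eigen::ConjugateGradient<NormalEqOp, Eigen::Lower | Eigen::Upper,
                               Eigen::IdentityPreconditioner> cg;
      cg.compute(B);
      Eigen::VectorXd x = cg.solve(-my_ATy(b));  // A^t * A * x = -A^t * b
      std::cout << "CG iters: " << cg.iterations()
                << "  error: " << cg.error() << std::endl;
      return 0;
    }

The point of wrapping B = A^t * A rather than A itself is that B is square and symmetric, so ordinary ConjugateGradient applies and only the forward and adjoint products are ever needed.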
EDIT 2
The matrix-vector products are obtained by applying forward- and reverse-mode autodiff to some objective functions (forward mode yields A * x, reverse mode yields A^t * y).
The easiest might be to implement your own preconditioner class inheriting from DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically, you need to compute:

    m_invdiag(j) = 1./mat.col(j).squaredNorm();

for each column j, using a strategy similar to what you already implemented for the product operators.
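Here is a minimal sketch of that idea, assuming a square matrix-free operator representing A^t * A that exposes rows(), cols() and operator* with a dense vector (like the NormalEqOp wrapper sketched in the question; the class name below is made up). Since mat.col(j) is not directly available matrix-free, it uses the fact that the j-th diagonal entry of A^t * A equals ||col_j(A)||^2 and extracts it with one operator application per unit vector e_j:

    #include <Eigen/Core>
    #include <Eigen/IterativeLinearSolvers>

    // Hypothetical diagonal preconditioner for a matrix-free operator B = A^t * A.
    // Inherits the storage, solve() and info() of DiagonalPreconditioner and only
    // redefines how the inverse diagonal is filled.
    template <typename Scalar_>
    class MatrixFreeLSPreconditioner : public Eigen::DiagonalPreconditioner<Scalar_>
    {
      typedef Eigen::DiagonalPreconditioner<Scalar_> Base;
    public:
      typedef Scalar_ Scalar;
      MatrixFreeLSPreconditioner() : Base() {}

      template <typename OpType>
      MatrixFreeLSPreconditioner& analyzePattern(const OpType&) { return *this; }

      template <typename OpType>
      MatrixFreeLSPreconditioner& factorize(const OpType& mat)
      {
        typedef Eigen::Matrix<Scalar, Eigen::Dynamic, 1> Vector;
        Vector e = Vector::Zero(mat.cols());
        this->m_invdiag.resize(mat.cols());
        for (Eigen::Index j = 0; j < mat.cols(); ++j)
        {
          e(j) = Scalar(1);
          Vector Bej = mat * e;              // j-th column of A^t * A
          Scalar d = Bej(j);                 // == ||col_j(A)||^2
          this->m_invdiag(j) = (d > Scalar(0)) ? Scalar(1) / d : Scalar(1);
          e(j) = Scalar(0);
        }
        this->m_isInitialized = true;
        return *this;
      }

      // Redefined so that it calls the factorize() above, not the base one:
      template <typename OpType>
      MatrixFreeLSPreconditioner& compute(const OpType& mat) { return factorize(mat); }
    };

Note that this costs n operator applications; if that is too expensive, accumulate the diagonal inside your own product routines instead. Then pass it as the solver's preconditioner template argument, e.g. Eigen::ConjugateGradient<NormalEqOp, Eigen::Lower|Eigen::Upper, MatrixFreeLSPreconditioner<double> >.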