The legacy scipy.optimize.leastsq function returns a cov_x
parameter:
cov_x: ndarray
Uses the fjac and ipvt optional outputs to construct an estimate of the jacobian around the solution. None if a singular matrix encountered (indicates very flat curvature in some direction). This matrix must be multiplied by the residual variance to get the covariance of the parameter estimates – see curve_fit.
This is useful for estimating the variance of the parameter estimates.
What is the equivalent of this parameter in the new scipy.optimize.least_squares? There is:
jac : ndarray, sparse matrix or LinearOperator, shape (m, n)
Modified Jacobian matrix at the solution, in the sense that J^T J is a Gauss-Newton approximation of the Hessian of the cost function. The type is the same as the one used by the algorithm.
but it is not really equivalent.
I do not believe there is an obvious equivalent. jac
is not the same thing: it is the Jacobian estimated at the solution, a matrix of partial derivatives that the solver uses to compute the gradient during minimization, and it has not been inverted or scaled into a covariance estimate.
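That said, if you want something comparable directly from a least_squares result, you can combine the two quoted passages: invert J^T J (a pseudoinverse guards against a rank-deficient Jacobian, as the curve_fit docs quoted below describe) and multiply by the residual variance. Here is a minimal sketch, assuming a made-up exponential model and synthetic data purely for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Made-up model and data, purely for illustration.
def residuals(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y

x = np.linspace(0, 1, 50)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(1.3 * x) + 0.05 * rng.standard_normal(x.size)

res = least_squares(residuals, x0=[1.0, 1.0], args=(x, y))

J = res.jac                                         # Jacobian at the solution (documented output)
cov_x = np.linalg.pinv(J.T @ J)                     # analogue of leastsq's cov_x
s_sq = np.sum(res.fun**2) / (x.size - len(res.x))   # residual variance
pcov = cov_x * s_sq                                 # covariance of the parameter estimates
perr = np.sqrt(np.diag(pcov))                       # one standard deviation errors
```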
You can perform least squares regression with curve_fit, which will return a covariance matrix.
Returns:

popt : array
    Optimal values for the parameters so that the sum of the squared residuals of f(xdata, *popt) - ydata is minimized.

pcov : 2-D array
    The estimated covariance of popt. The diagonals provide the variance of the parameter estimate. To compute one standard deviation errors on the parameters, use perr = np.sqrt(np.diag(pcov)). How the sigma parameter affects the estimated covariance depends on the absolute_sigma argument, as described above. If the Jacobian matrix at the solution does not have full rank, the 'lm' method returns a matrix filled with np.inf; the 'trf' and 'dogbox' methods, on the other hand, use the Moore-Penrose pseudoinverse to compute the covariance matrix.
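For example, a minimal usage sketch (the model and data are again made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up model and data, purely for illustration.
def model(x, a, b):
    return a * np.exp(b * x)

xdata = np.linspace(0, 1, 50)
rng = np.random.default_rng(0)
ydata = model(xdata, 2.0, 1.3) + 0.05 * rng.standard_normal(xdata.size)

popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0])
perr = np.sqrt(np.diag(pcov))  # one standard deviation errors, as the docs above describe
```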
See also scipy.stats.linregress, which also does least squares and returns a correlation coefficient related to the covariance.
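For a simple straight-line fit, a quick sketch with made-up data:

```python
import numpy as np
from scipy import stats

# Made-up linear data, purely for illustration.
x = np.arange(10.0)
y = 2.0 * x + 1.0 + np.random.default_rng(0).standard_normal(x.size)

result = stats.linregress(x, y)
print(result.slope, result.intercept)
print(result.rvalue)   # correlation coefficient
print(result.stderr)   # standard error of the slope estimate
```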