
Least squares estimation in Python using numpy and scipy


Both scipy and numpy have least squares estimation routines, but I got slightly confused after reading the documentation.

So, my problem is classical regression: I am trying to find the best matrix transformation between pairs of vectors. Something of the form:

Ax = b

Here, I know x and b. A has 9 unknown components, and x and b are 3D vectors, so I need at least 3 point pairs. I have 3 non-collinear x and b vectors, so I can stack them into 3x3 matrices.

However, from the documentation, I see that it is designed for solving systems where A and b are known and we solve for x. Assuming A is invertible, this means something like:

x = A^(-1) b          A^(-1) indicating the inverse or pseudo-inverse.
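For reference, the standard setup the documentation describes can be sketched like this (the values of `A` and `b` below are made up for illustration):

```python
import numpy as np

# Standard least-squares setup: A and b are known, solve A x = b for x.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# lstsq returns the solution, residuals, rank, and singular values.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
```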

In my case, the solution becomes

A = b x^(-1)
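With exactly three non-collinear points, that inversion can be done directly by stacking the point pairs as columns. A minimal sketch (the matrices `X`, `A_true`, and `B` here are made-up example data):

```python
import numpy as np

# Columns of X are the three known input vectors, columns of B
# are the corresponding outputs, so A @ X = B.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
A_true = np.array([[1.0, 2.0, 0.0],
                   [0.0, 1.0, 1.0],
                   [3.0, 0.0, 1.0]])
B = A_true @ X

# A = B X^(-1); valid only when X is square and invertible,
# i.e. the three points are non-collinear.
A = B @ np.linalg.inv(X)
```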

I was wondering if I can still somehow use the built in numpy machinery for my setup.


Solution

  • Look at this:

    Ax = b
    x^TA^T = b^T
    

    where A^T denotes the transpose of A. Now define Ap = x^T, Xp = A^T, and bp = b^T, and your problem becomes:

    Ap Xp = bp
    

    which is exactly the form you can handle with least squares.
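    A minimal sketch of that transpose trick with `np.linalg.lstsq` (the data below is randomly generated for illustration; with more than three point pairs, the columns of `X` and `B` just grow and the solve becomes a genuine least-squares fit):

    ```python
    import numpy as np

    # A @ X = B  ->  X.T @ A.T = B.T, which lstsq can solve for A.T.
    rng = np.random.default_rng(0)
    A_true = rng.standard_normal((3, 3))
    X = rng.standard_normal((3, 5))        # 5 points, one per column
    B = A_true @ X

    # Solve X.T @ At = B.T in the least-squares sense, then transpose back.
    At, *_ = np.linalg.lstsq(X.T, B.T, rcond=None)
    A_est = At.T
    ```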