Tags: python, numpy, scipy, least-squares

plane fitting with normalized coefficients


I am trying to fit 3D points to a plane in 2.5D/3D using scipy.optimize.leastsq.

I am trying to minimize the function: ax + by + c - z

When I add noise to the planes I'm generating, I start to get different results for (a, b, c), although the linear relationship between a and b remains correct.

My question: Is there a way to constrain the fitting parameters so that (a, b, c) stays normalized during the optimization?

I can normalize after the optimization and then run another search for the last parameter, but that feels inefficient and causes big variations in parameter c. Any recommendations?

Thanks!

Here are some different results I got for the same plane:

[ 12.88343415     6.7993803   4001.717 ] 

[ 14.52913549     7.44262692  3201.1523]

[ 4.37394650e+00   2.20546734e+00   9.56e+03]

[ 24.32259278    12.32581015 -2748.026]

After normalizing by sqrt(a^2 + b^2 + 1):

[-0.97401694  0.20292819 -6.16468053] 
[-0.97527745  0.1976869  -2.46058884] 
[-0.97573799  0.19342358  5.42282738] 
[ -0.97894621   0.17992336  13.52561491] 
[ -0.97821728   0.17834579  24.5345626 ] 
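
For clarity, the normalization itself is just a division by the norm of the plane normal (a, b, -1); a minimal sketch, using the first raw fit above only as example input:

import numpy as np

a, b, c = 12.88343415, 6.7993803, 4001.717   # first raw fit above, as example input
scale = np.sqrt(a**2 + b**2 + 1)             # norm of the plane normal (a, b, -1)
print(np.array([a, b, c]) / scale)           # normalized coefficients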

code:

import numpy as np
from scipy.optimize import leastsq

def least_squares(neighborhood, p0):
    """
    Computes the least-squares plane fit for the points in neighborhood.
    p0 is the initial guess for (a, b, c).
    Returns (a, b, c) for the local minimum found.
    """
    if neighborhood.shape[0] < 5:
        return None  # too few points for a meaningful fit
    sol = leastsq(residuals, p0, args=(None, neighborhood.T))[0]
    return sol

def f_min(X, p):
    """
    Plane residual to minimise: a*x + b*y + c - z for each point.
    X is a (3, N) array of points (rows x, y, z); p = (a, b, c).
    """
    ab = p[0:2]
    distance = (ab * X[:2].T).sum(axis=1) + p[2] - X[2]
    return distance

def residuals(params, signal, X):
    """
    Residuals for the least-squares fit.
    signal is unused; it is only there to match the expected signature.
    """
    return f_min(X, params)



p0 = np.random.uniform(-50, 50, size=3)  # random initial guess for (a, b, c)
sol = least_squares(neighborhood, p0)
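
For completeness, here is one way to build a noisy test neighborhood for this code; the plane parameters and noise level below are made up for illustration, not taken from my data:

# build a noisy test plane z = a*x + b*y + c (illustrative values only)
rng = np.random.default_rng(0)
a_true, b_true, c_true = 2.0, -1.5, 10.0
xy = rng.uniform(-10, 10, size=(200, 2))
z = a_true * xy[:, 0] + b_true * xy[:, 1] + c_true + rng.normal(scale=0.5, size=200)
neighborhood = np.column_stack([xy, z])   # shape (N, 3): columns x, y, z

With neighborhood built like this, the two lines above run unchanged, and for small noise sol comes back close to (a_true, b_true, c_true).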

Solution

  • Here's one way. Given N (X, Y, Z) values, you want to find a, b, c, d to minimise

    Q = Sum{ i | square( (a,b,c)*(X[i], Y[i], Z[i])' - d)}
    

    Whatever value you choose for (a,b,c), the value of d that minimises this (set dQ/dd = 0) will be

    d = Sum{ i | (a,b,c)*(X[i], Y[i], Z[i])' }/N
      = (a,b,c)*(Xbar,Ybar,Zbar)'
    

    where Xbar is the average of the X etc.

    Plugging this into the expression for Q, we get

    Q = Sum{ i | square( (a,b,c)*(x[i], y[i], z[i])')}
      = (a,b,c)*Sum{ i | (x[i], y[i], z[i])' * (x[i], y[i], z[i])}*(a,b,c)'
      = (a,b,c)*M*(a,b,c)'
    

    where x[i] = X[i] - Xbar, y[i] = Y[i] - Ybar, etc., and

    M = Sum{ i | (x[i], y[i], z[i])' * (x[i], y[i], z[i])}
    

    The minimum of Q over normalised (a,b,c) is the smallest eigenvalue of M, and (a,b,c) is an eigenvector for that eigenvalue.

    So the procedure is:

    a/ compute the means Xbar, Ybar, Zbar of the coordinates and subtract them from X[], Y[], Z[] to get the centred x[], y[], z[]

    b/ form the matrix M and diagonalise it

    c/ the eigenvector for the lowest eigenvalue of M gives you (a,b,c)

    d/ compute d via

    d = (a,b,c)*(Xbar,Ybar,Zbar)'
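
    In numpy, steps a/ to d/ can be sketched like this (the function name and the return convention are mine, purely to illustrate the recipe):

    import numpy as np

    def fit_plane_eig(points):
        """
        Fit (a,b,c)*(x,y,z)' = d to an (N, 3) array of points using the
        centroid + smallest-eigenvalue recipe above.
        Returns the unit normal (a, b, c) and the offset d.
        """
        centroid = points.mean(axis=0)        # (Xbar, Ybar, Zbar)
        centered = points - centroid          # x[i], y[i], z[i]
        M = centered.T @ centered             # the 3x3 matrix from step b/
        eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
        normal = eigvecs[:, 0]                # eigenvector of the smallest eigenvalue
        d = normal @ centroid                 # step d/
        return normal, d

    For the question's parametrisation z = a*x + b*y + c, the slopes can then be read off as a = -normal[0]/normal[2], b = -normal[1]/normal[2] and c = d/normal[2] (the eigenvector's arbitrary sign cancels in these ratios), provided normal[2] is not zero, i.e. the plane is not vertical.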