Tags: python, algorithm, feature-detection, least-squares

Least squares for circle detection


I'm trying to perform circle detection on a laser scan using least squares optimization over a subset of the data points. Since the measurements cover only part of a circle, the least squares fit returns a faulty result: it reports a circle much closer to the laser than it actually is.

The outcome of the algorithm is shown in the picture. Scatter points indicate the laser measurements, and the drawn circles are centered on the points returned by the algorithm. The gray semi-transparent shape is the robot taking the scan (the lasers are on the left and right of this shape).

I'm interested only in the local coordinates of the circle, which has a known radius RR.

PS. I assume the scan has already been separated into clusters (self.clusters[i] is one cluster), each a list of [x, y] laser points.

def circle(x, scan):
    # Residual: distance of each scan point from the circle of radius RR
    # centred at (xc, yc); zero for points lying exactly on the circle.
    xc, yc = x
    f = np.sqrt((scan[:, 0] - xc)**2 + (scan[:, 1] - yc)**2) - RR
    return f


def optimize_detect_circles(self):

    centre = [1, 1]  # initial guess for the circle centre

    for i in range(self.number_of_clusters):
        range_points = np.array(self.clusters[i])

        # Levenberg-Marquardt fit of the circle centre to this cluster
        sol = optimize.root(circle, centre, args=(range_points,), method='lm')
        self.circle_candidates.append(sol.x)
        print(sol.x)
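For context, here is a self-contained sketch of the same fit on a synthetic partial arc (RR is assumed to be a module-level constant, set to 1.0 here; the arc centre at (2, 3) and the starting guess are made up for the demo):

```python
import numpy as np
from scipy import optimize

RR = 1.0  # known circle radius (assumed module-level constant)

def circle(x, scan):
    # Residual: distance of each scan point from the circle of radius RR
    # centred at (xc, yc); zero for points lying exactly on the circle.
    xc, yc = x
    return np.sqrt((scan[:, 0] - xc)**2 + (scan[:, 1] - yc)**2) - RR

# Synthetic quarter-arc around (2.0, 3.0): only part of the circle is
# visible, mimicking a laser scan of a cylindrical object.
theta = np.linspace(0.0, np.pi / 2, 40)
scan = np.column_stack([2.0 + RR * np.cos(theta),
                        3.0 + RR * np.sin(theta)])

# With a starting point near the data the fit behaves well; a far-away
# start such as (1, 1) can instead land in a spurious optimum.
sol = optimize.root(circle, [2.2, 2.8], args=(scan,), method='lm')
print(sol.x)  # close to (2.0, 3.0)
```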

Here's the picture:

[figure: scatter of laser points with the circles returned by the algorithm; the gray shape is the robot]


Solution

  • (1, 1) is too far from the correct value; you most probably fall into a local optimum.

    Try starting from a point much closer to the real center. You can find one as follows: first fit a straight line to your cluster; then separate the points into two halves, according to which half of the line they project onto; next fit a line to each of the two new subclusters; and finally intersect the two perpendiculars erected at the subclusters' middle points. This works because the perpendicular at the midpoint of any chord passes through the circle's center, so two such perpendiculars meet at (approximately) the center.

    This is predicated on each cluster being an arc spanning no more than 180 degrees, which appears to be the case here. If not, just repeat the subdivision to get four chords instead of two.
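A minimal sketch of that construction (the helper names are my own, and each cluster is assumed to be an (N, 2) NumPy array; the line fits use the principal direction of the points, which also handles vertical chords):

```python
import numpy as np

def chord_direction(pts):
    # Principal direction of the point set = direction of its fitted line.
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[0]  # unit vector along the chord

def initial_center(points):
    """Rough centre guess for an arc cluster spanning < 180 degrees."""
    pts = np.asarray(points, dtype=float)
    d = chord_direction(pts)
    t = (pts - pts.mean(axis=0)) @ d          # position along the main chord
    half_a = pts[t <= np.median(t)]           # split into two sub-chords
    half_b = pts[t > np.median(t)]

    mids, normals = [], []
    for half in (half_a, half_b):
        mids.append(half.mean(axis=0))        # middle point of the sub-chord
        u = chord_direction(half)
        normals.append(np.array([-u[1], u[0]]))  # perpendicular to it

    # Intersect the two perpendiculars: mids[0] + s*n0 = mids[1] + r*n1.
    A = np.column_stack([normals[0], -normals[1]])
    s, _ = np.linalg.solve(A, mids[1] - mids[0])
    return mids[0] + s * normals[0]

# Demo on a synthetic 120-degree arc centred at (5, 3) with radius 2.
theta = np.linspace(0.0, 2 * np.pi / 3, 50)
arc = np.column_stack([5.0 + 2.0 * np.cos(theta),
                       3.0 + 2.0 * np.sin(theta)])
print(initial_center(arc))  # close to [5. 3.]
```

The returned point can then be passed as `centre` to the `optimize.root` call in place of the fixed `[1, 1]`.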