Tags: image-processing, computer-vision, labview

Contour analysis in LabVIEW Vision Development


I have two images containing the same element. I want to detect the element's contour in both images and compute the distances between the contours.

For debugging, I draw the points that are treated as corresponding, to visualize which points are used to compute the distances.

Unfortunately, it seems that almost the same points are picked on the template image as on the target image. I expected it to compute distances between corresponding points in the two images, so that if the contour is rotated, the distances would be large.

My question is: how are the points chosen for computing the distances? What is wrong with my code? The LabVIEW documentation says nothing about the controls I'm using.

I'm attaching a VI so you can test it and check whether my code is OK -> Link

I'm not attaching any images, since the point is not to solve my particular case but to figure out how LabVIEW works.


Solution

  • The answer appeared in the topic referenced in the comments. Link again: http://forums.ni.com/t5/Machine-Vision/Contour-analysis/td-p/2138766

    To sum up and answer this question:

    Compute Contour Distance first locates the template contour in the target image using a contour-matching algorithm (based on Geometric Pattern Matching). The matching algorithm accounts for shift, rotation, scale, and occlusion. Once a match is found, a refinement algorithm generates an accurate correspondence between the template contour points and the target contour points. After this one-to-one correspondence is established, the distances are calculated.
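To make the two-stage process from the forum answer concrete, here is a rough plain-Python sketch of the idea: align the template contour to the target first (matching), then pair points one-to-one and measure residual distances. This is *not* LabVIEW's actual algorithm; the ellipse test data, the brute-force rotation search, and the nearest-neighbour pairing are simplifications I chose for illustration. The point it demonstrates is the one from the answer: because alignment happens before correspondence, a rotated-but-identical contour yields near-zero distances, while a genuine local deformation stands out.

```python
import math

def ellipse_contour(n=72, a=2.0, b=1.0):
    """Sample n points along an ellipse, standing in for a detected contour."""
    return [(a * math.cos(2 * math.pi * i / n), b * math.sin(2 * math.pi * i / n))
            for i in range(n)]

def transform(pts, angle, dx, dy, bump_index=None, bump_scale=1.0):
    """Rotate and translate a contour; optionally push one point outward (a 'defect')."""
    c, s = math.cos(angle), math.sin(angle)
    out = []
    for i, (x, y) in enumerate(pts):
        if i == bump_index:
            x, y = x * bump_scale, y * bump_scale
        out.append((c * x - s * y + dx, s * x + c * y + dy))
    return out

def center(pts):
    """Remove translation by shifting the contour to its centroid."""
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return [(x - cx, y - cy) for x, y in pts]

def nearest_dist(p, pts):
    """Distance from point p to its nearest neighbour in pts."""
    return min(math.hypot(p[0] - q[0], p[1] - q[1]) for q in pts)

def align_rotation(template, target, steps=360):
    """Brute-force the rotation that best matches the template onto the target
    (a crude stand-in for the geometric-matching step)."""
    best = (float('inf'), 0.0)
    for k in range(steps):
        a = 2 * math.pi * k / steps
        c, s = math.cos(a), math.sin(a)
        rot = [(c * x - s * y, s * x + c * y) for x, y in template]
        best = min(best, (sum(nearest_dist(p, target) for p in rot), a))
    return best[1]

# Template contour, and a shifted + rotated copy with one deformed point.
template = ellipse_contour()
target = transform(template, angle=0.7, dx=2.0, dy=-1.0, bump_index=0, bump_scale=1.3)

# Stage 1 (matching): remove translation, then find the best rotation.
t0, g0 = center(template), center(target)
a = align_rotation(t0, g0)
c, s = math.cos(a), math.sin(a)
aligned = [(c * x - s * y, s * x + c * y) for x, y in t0]

# Stage 2 (correspondence + distances): pair each target point with its
# nearest aligned template point and measure the residual distance.
dists = sorted(nearest_dist(p, aligned) for p in g0)
print("median distance: %.3f" % dists[len(dists) // 2])  # near zero despite rotation
print("max distance:    %.3f" % dists[-1])               # the deformed point stands out
```

If distances were computed by pairing points by index without the alignment stage, the rotation alone would inflate every distance, which is the behaviour the question expected; the answer's key point is that matching removes the rigid transform first, so only genuine contour differences remain.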