Tags: python, c#, opencv, contour

Different results from cv2.findContours in Java and Python


I'm translating a Python program that uses OpenCV into C# via the OpenCV Java bindings. I tested both the Python and C# programs on the same image, and I realized that the findContours method returns different contours in the two programs.

Python: _, contours, hierarchy = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

C#: Imgproc.FindContours(edges, contours, hierarchy, Imgproc.RetrTree, Imgproc.ChainApproxSimple);

For Python I checked with len(contours), and for C# with contours.Count; they return 94 and 106 respectively. I think this may be the cause of many discrepancies in my translated program, and I'm not sure why. What am I doing wrong here?

Add-on: the Canny call below runs just before findContours. Everything before that simply reads the image and converts it to grayscale, hence the grayImg variable.

C#: Imgproc.Canny(grayImg, edges, 100, 200, 3, false);

Python: edges = cv2.Canny(gray, 100, 200, apertureSize = 3)
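
For reference, here is a minimal runnable sketch of the Python side of the pipeline described above (the file name "input.png" is a placeholder, not from the original program):

    import cv2

    # Read the image and convert it to grayscale, as in the steps above.
    img = cv2.imread("input.png")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Edge detection with the same parameters as in the question.
    edges = cv2.Canny(gray, 100, 200, apertureSize=3)

    # OpenCV 3.x returns (image, contours, hierarchy) from findContours.
    _, contours, hierarchy = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    print(len(contours))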

I previously thought it was because of differing OpenCV versions, but both appear to be using OpenCV 3. Unless findContours behaves differently in 3.1 than in 3.4, I'm back to square one, as I don't know the cause of the problem.
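
One quick way to rule versions in or out is to print them explicitly. On the Python side this is a one-liner; on the Java-bindings side, the org.opencv.core.Core.VERSION constant reports the same information:

    import cv2

    # Prints the exact OpenCV build, e.g. "3.4.0" or "3.1.0".
    print(cv2.__version__)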


Solution

  • I've finally found the answer to this phenomenon. The Python program is using OpenCV 3.4, while the Java bindings in my C# program are using the older OpenCV 3.1. The findContours method exists in both versions, but apparently returns different results.
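
    As an aside, the Python unpacking in the question is itself version-sensitive: cv2.findContours returns three values in OpenCV 3.x but only two in 2.x and 4.x. A small compatibility wrapper (a sketch, not part of the original program) avoids hard-coding either signature:

        import cv2

        def find_contours_compat(edges):
            # OpenCV 3.x returns (image, contours, hierarchy); 2.x and 4.x
            # return (contours, hierarchy). Taking the last two elements of
            # the result tuple works for all of these versions.
            result = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
            return result[-2], result[-1]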