Tags: swift, xcode, opencv, arkit, apple-vision

How to detect contours of object and describe it to compare on server with ARKit


I want to detect a shape and then describe it (somehow) so it can be compared with data on a server.

So the first question is: is it possible to detect a blob-like shape with ARKit?

To be more specific, let me describe my use case in general terms.

I want to scan an image with the phone, extract the specific shape, send it to a server, compare the two images on the server (the server image is the reference; the scanned image would be very similar) and then send some data back. I am not asking about the server side; the only server-related question is what I should compare: the images themselves using OpenCV, some mathematical description of both images whose similarity can be measured, etc.

If the question is hard to understand, let's split it into two simpler questions: 1) How do I scan a 2D object with an iPhone and save it (i.e. crop the specific shape from its background, given that the object is black and the background is white)? 2) How do I describe the scanned object so it can be compared with an almost identical object?


Solution

    • ARKit is of no use here.
    • You will most likely need a fair amount of Core Image (to fix the perspective distortion and binarize the shape) combined with OpenCV logic; see the Core Image sketch after this list.
    • Vision may help you extract the region of interest (ROI) from the full frame, especially if the waveform image sits inside some kind of rectangle; see the rectangle-detection sketch below.
    • You could also train a custom ML model that recognizes your specific waveforms (or waveforms in general) and run it through Vision; a sketch for that is included below as well.

    In any case, it is not a trivial task.
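
As a starting point, here is a minimal sketch of how Vision's `VNDetectRectanglesRequest` could be used to find the rectangle that is assumed to contain the shape, so that only this region is processed further. The confidence and aspect-ratio thresholds are assumptions you would tune for your material.

```swift
import Foundation
import CoreGraphics
import Vision

// Sketch only: detect the rectangle that is assumed to contain the shape,
// so later steps can work on that region instead of the whole frame.
// The threshold values below are assumptions to tune for your material.
func detectShapeRectangle(in cgImage: CGImage,
                          completion: @escaping (VNRectangleObservation?) -> Void) {
    let request = VNDetectRectanglesRequest { request, error in
        guard error == nil else { completion(nil); return }
        // Take the most confident rectangle, if any was found.
        completion(request.results?.first as? VNRectangleObservation)
    }
    request.minimumConfidence = 0.8
    request.maximumObservations = 1
    request.minimumAspectRatio = 0.2   // adjust to the expected proportions of the shape

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do { try handler.perform([request]) } catch { completion(nil) }
    }
}
```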
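Once you have the four corners of that rectangle (Vision returns them in normalized coordinates, so scale them by the image extent first), a Core Image sketch for straightening and binarizing the region could look like the following. The 0.5 threshold is an arbitrary assumption, and `CIColorThreshold` requires iOS 14 / macOS 11 or later.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch: straighten a detected quadrilateral and binarize it with Core Image.
// The corner points are expected in image coordinates (already scaled from
// Vision's normalized coordinates).
func binarizedShape(from image: CIImage,
                    topLeft: CGPoint, topRight: CGPoint,
                    bottomLeft: CGPoint, bottomRight: CGPoint) -> CIImage {
    // Remove perspective distortion so the shape is seen "straight on".
    let perspective = CIFilter.perspectiveCorrection()
    perspective.inputImage = image
    perspective.topLeft = topLeft
    perspective.topRight = topRight
    perspective.bottomLeft = bottomLeft
    perspective.bottomRight = bottomRight

    // Threshold to pure black/white; 0.5 is an arbitrary starting value
    // that you would tune (or compute adaptively) for your lighting.
    let threshold = CIFilter.colorThreshold()   // iOS 14+ / macOS 11+
    threshold.inputImage = perspective.outputImage
    threshold.threshold = 0.5

    return threshold.outputImage ?? image
}
```

The resulting black-and-white image is what you would then hand over to OpenCV (or send to the server) for contour extraction and comparison.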
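And if you do train a custom model, running it through Vision is straightforward. In the sketch below, `WaveformClassifier` is a hypothetical model name standing in for whatever Core ML model you would export; everything else is standard Vision API.

```swift
import CoreGraphics
import CoreML
import Vision

// Sketch: run a hypothetical custom Core ML classifier through Vision to
// decide whether the cropped region actually contains a waveform.
// "WaveformClassifier" is a placeholder for your own generated model class.
func classifyWaveform(in cgImage: CGImage,
                      completion: @escaping (String?, Float) -> Void) {
    guard let coreMLModel = try? WaveformClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil, 0)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Report the top classification label and its confidence.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier, top?.confidence ?? 0)
    }
    request.imageCropAndScaleOption = .centerCrop

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```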