ios · swift · object-detection · arkit · coreml

How to find distance between Pupil Center and Eyeglass Frame Edge


I am trying to measure the real-life distance between the center of the pupil and the top edge of the bottom rim of the eyeglass frame the user is wearing, as shown in the photo:

pupil to eyeglass frame distance

I am using ARKit, and via faceAnchor.leftEyeTransform and faceAnchor.rightEyeTransform I am able to reliably get the center of each pupil.
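For reference, a minimal sketch of how the pupil centers can be read from an ARFaceAnchor. The eye transforms are expressed relative to the face anchor, so they must be multiplied by the anchor's own transform to get world-space positions (the function name is mine, not from the question):

```swift
import ARKit

// Returns the world-space positions (in metres) of both pupil centers
// for a tracked face anchor.
func pupilWorldPositions(for faceAnchor: ARFaceAnchor) -> (left: simd_float3, right: simd_float3) {
    // leftEyeTransform / rightEyeTransform are relative to the anchor,
    // so compose them with the anchor's world transform.
    let leftWorld  = faceAnchor.transform * faceAnchor.leftEyeTransform
    let rightWorld = faceAnchor.transform * faceAnchor.rightEyeTransform

    // The translation column (columns.3) of each matrix is the eye's position.
    let left  = simd_float3(leftWorld.columns.3.x,  leftWorld.columns.3.y,  leftWorld.columns.3.z)
    let right = simd_float3(rightWorld.columns.3.x, rightWorld.columns.3.y, rightWorld.columns.3.z)
    return (left, right)
}
```

Because ARKit works in metres, distances derived from these positions are already real-world measurements.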

I am struggling, however, with detecting the edge of the frame. All the references online point to detecting planes with ARKit, which does not seem to apply in this case. It can definitely be done, as I have seen iOS apps doing this.


Solution

  • Detecting the frames is not possible using ARKit alone. ARKit only tracks the face and facial features, not worn accessories. In order to detect the frames, you will need to use CoreML and some machine learning.

    More specifically, you would want to use a model similar to YOLOv3 (https://towardsdatascience.com/training-a-yolov3-object-detection-model-with-a-custom-dataset-4981fa480af0) to detect the frames and report the bounding box of where they appear in the image. The lower horizontal bound of that box represents the bottom of the frame. Taking the distance from the pupil to the lower horizontal bound would give you a fairly close estimate of the distance from the pupil to the bottom of the glasses.
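As a sketch of how the detection step could be wired up once you have a trained model: the code below assumes a hypothetical custom-trained detector exported to CoreML as `GlassesDetector.mlmodel` whose output class is the eyeglass frame (the model name and function are assumptions, not an existing API). Vision returns normalized bounding boxes with a bottom-left origin, so the box's `minY` edge corresponds to the bottom of the frame:

```swift
import Vision
import CoreML

// Runs a CoreML object detector on a camera frame and hands back the
// highest-confidence bounding box (normalized [0, 1] coordinates), or nil.
func detectFrameBoundingBox(in pixelBuffer: CVPixelBuffer,
                            completion: @escaping (CGRect?) -> Void) {
    // "GlassesDetector" is a placeholder for your custom-trained model.
    guard let model = try? VNCoreMLModel(for: GlassesDetector().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Object-detection models surface results as VNRecognizedObjectObservation.
        let best = (request.results as? [VNRecognizedObjectObservation])?
            .max(by: { $0.confidence < $1.confidence })
        completion(best?.boundingBox)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right)
    try? handler.perform([request])
}
```

From there, one approach is to project the pupil's world position into the same image coordinates (e.g. with ARKit's camera projection), measure the pixel distance to the box's bottom edge, and convert pixels to millimetres using the known depth of the face from the camera.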