
CIFaceDetector origin inaccuracy


I'm trying to use CIDetector/CIFaceDetector for basic face detection, and while it seems to recognize faces correctly, the bounds rectangle is consistently inaccurate for every image I throw at it. Here's a sample, with the bounds it detects drawn as a green box: https://i.sstatic.net/E0vkH.jpg

Everything seems to be universally shifted down or mirrored by this amount. It's like the coordinates are measured from the bottom left rather than the top left. I've tried all eight CIDetectorImageOrientation values on this image and they all return the same incorrect coordinates. What am I missing here? Here is the code:

NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
self.faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
self.features = [self.faceDetector featuresInImage:[CIImage imageWithCGImage:image.CGImage options:@{CIDetectorImageOrientation: @(1)}]];

That's really it. The image is a basic UIImage imageWithData: from the web.


Solution

  • You are working with both UIKit and Core Image, and each of those frameworks uses a different coordinate system:

    • UIKit's coordinate system has its origin at the top-left corner
    • Core Image's coordinate system has its origin at the bottom-left corner

    You are probably drawing the green rectangle using Core Image coordinates in a UIKit context. Your code works as it should; you just need to convert the coordinates (see the sketch below).

    You can also find this documented in the iOS Developer Library.

    See the "Core Image and UIKit coordinates" blog post for a very neat way to convert between these two systems.
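
    For illustration, here is a minimal sketch of that conversion, assuming the features were detected in the same image the question uses and that the UIImage's point size matches the CIImage's extent (i.e. a scale of 1.0):

    // Flip from Core Image's bottom-left origin to UIKit's top-left origin.
    // Assumes image.size matches the extent of the CIImage the detector ran on.
    CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
    transform = CGAffineTransformTranslate(transform, 0, -image.size.height);

    for (CIFaceFeature *face in self.features) {
        // faceRect is now in UIKit coordinates and can be drawn directly
        // into a view or UIKit graphics context.
        CGRect faceRect = CGRectApplyAffineTransform(face.bounds, transform);
    }

    Equivalently, for each rect you can compute the flipped origin by hand: y = image.size.height - rect.origin.y - rect.size.height, leaving x, width, and height unchanged.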