Tags: ios, objective-c, uiimage, crop, face-detection

iOS - Issues cropping an image to a detected face


I am trying to crop a UIImage to a face that has been detected using the built-in CoreImage face detection functionality. I seem to be able to detect the face properly, but when I attempt to crop my UIImage to the bounds of the face, it is nowhere near correct. My face detection code looks like this:

-(NSArray *)facesForImage:(UIImage *)image {
    // Wrap the UIImage's backing CGImage in a CIImage for Core Image processing
    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];

    CIContext *context = [CIContext contextWithOptions:nil];
    NSDictionary *opts = @{CIDetectorAccuracy : CIDetectorAccuracyHigh};

    // Build a high-accuracy face detector and return whatever features it finds
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:context options:opts];
    NSArray *features = [detector featuresInImage:ciImage];

    return features;
}

...and the code to crop the image looks like this:

-(UIImage *)imageCroppedToFaceAtIndex:(NSInteger)index forImage:(UIImage *)image {
    NSArray *faces = [self facesForImage:image];
    if((index < 0) || (index >= faces.count)) {
        DDLogError(@"Invalid face index provided");

        return nil;
    }

    CIFaceFeature *face = [faces objectAtIndex:index];
    CGRect faceBounds = face.bounds;

    // Crop the underlying CGImage to the detected bounds
    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, faceBounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageCreateWithImageInRect returns a +1 reference

    return croppedImage;
}

I have an image with only one face in it that I'm using for testing, and the detector appears to find it with no problem. But the crop is way off. Any idea what the problem with this code could be?


Solution

  • For anyone else having a similar issue -- transforming the Core Image (bottom-left-origin) coordinates returned by the detector into UIImage (top-left-origin) coordinates -- I found this great article explaining how to use CGAffineTransform to accomplish exactly what I was looking for. A sketch of the approach follows.
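
Put concretely: CIDetector reports face bounds in Core Image's bottom-left-origin coordinate space, while CGImageCreateWithImageInRect expects a top-left-origin rect, so the bounds need to be flipped vertically before cropping. Below is a minimal sketch of that flip, assuming an image with scale 1.0 and the default up orientation; the method mirrors the one in the question but is illustrative, not the article's exact code:

-(UIImage *)imageCroppedToFaceAtIndex:(NSInteger)index forImage:(UIImage *)image {
    NSArray *faces = [self facesForImage:image];
    if((index < 0) || (index >= faces.count)) {
        return nil;
    }

    CIFaceFeature *face = [faces objectAtIndex:index];

    // Flip the Core Image rect into UIKit coordinates:
    // scale Y by -1, then translate by the image height.
    CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
    transform = CGAffineTransformTranslate(transform, 0, -image.size.height);
    CGRect faceBounds = CGRectApplyAffineTransform(face.bounds, transform);

    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, faceBounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // release the +1 reference from the crop

    return croppedImage;
}

With the transform applied, the cropped region lines up with the detected face instead of landing in the mirrored position near the opposite edge of the image.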