I am implementing a zooming feature in a camera app using AVFoundation. I am scaling my preview view like this:
[videoPreviewView setTransform:CGAffineTransformMakeScale(cameraZoom, cameraZoom)];
Now, after I take a picture, I would like to zoom/crop the picture with the cameraZoom
value before I save it to the camera roll. How best should I do this?
Edit: Using Justin's answer:
CGRect imageRect = CGRectMake(0.0f, 0.0f, image.size.width, image.size.height);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], imageRect);
// Bitmap context with the same dimensions and pixel format as the source image.
CGContextRef bitmapContext = CGBitmapContextCreate(NULL, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef), CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), CGImageGetColorSpace(imageRef), CGImageGetBitmapInfo(imageRef));
// Scale the context, then draw the full image into it.
CGContextScaleCTM(bitmapContext, scale, scale);
CGContextDrawImage(bitmapContext, imageRect, imageRef);
CGImageRef zoomedCGImage = CGBitmapContextCreateImage(bitmapContext);
UIImage* zoomedImage = [[UIImage alloc] initWithCGImage:zoomedCGImage];
It is zooming the image, but it is not using the center; it seems to be anchored near the top-right area instead (I'm not positive). I suspect this is because CGContextScaleCTM scales about the context's origin rather than the image's center. The other problem (I should have been clearer in the OP) is that the image stays at its original resolution; I would rather just crop it down.
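The cropping method below takes the centered 1/zoom portion of the image, so the result is both centered and reduced to the cropped size: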
+ (UIImage*)croppedImageWithImage:(UIImage *)image zoom:(CGFloat)zoom
{
    // A zoom of 2.0 keeps the middle half of the width and height.
    CGFloat zoomReciprocal = 1.0f / zoom;
    // Offset so the cropped rect is centered in the original image.
    CGPoint offset = CGPointMake(image.size.width * ((1.0f - zoomReciprocal) / 2.0f), image.size.height * ((1.0f - zoomReciprocal) / 2.0f));
    CGRect croppedRect = CGRectMake(offset.x, offset.y, image.size.width * zoomReciprocal, image.size.height * zoomReciprocal);
    CGImageRef croppedImageRef = CGImageCreateWithImageInRect([image CGImage], croppedRect);
    UIImage* croppedImage = [[UIImage alloc] initWithCGImage:croppedImageRef scale:[image scale] orientation:[image imageOrientation]];
    CGImageRelease(croppedImageRef);
    return croppedImage;
}
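For completeness, a minimal sketch of how this might be wired up after capture. This assumes details not shown above: stillImageOutput is an AVCaptureStillImageOutput already attached to the session, MyCameraUtils is a placeholder for whatever class holds the method above, and cameraZoom is the same value used for the preview transform.

// Requires AVFoundation and UIKit.
AVCaptureConnection* connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer == NULL) { return; }
    // Convert the captured sample buffer to a UIImage.
    NSData* jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
    UIImage* fullImage = [UIImage imageWithData:jpegData];
    // Crop to the centered region matching the preview zoom (MyCameraUtils is a placeholder class name).
    UIImage* croppedImage = [MyCameraUtils croppedImageWithImage:fullImage zoom:cameraZoom];
    // Save the cropped image to the camera roll.
    UIImageWriteToSavedPhotosAlbum(croppedImage, nil, nil, NULL);
}];

Cropping before saving keeps only the visible portion, so the saved photo matches what the zoomed preview showed.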