I'm cropping UIImages with a UIBezierPath using a UIGraphics image context:
CGSize thumbnailSize = CGSizeMake(54.0f, 45.0f); // dimensions of UIBezierPath
UIGraphicsBeginImageContextWithOptions(thumbnailSize, NO, 0);
[path addClip];
[originalImage drawInRect:CGRectMake(0, originalImage.size.height/-3, thumbnailSize.width, originalImage.size.height)];
UIImage *maskedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But for some reason my images are getting stretched vertically (everything looks slightly long and skinny), and this effect is stronger the bigger my originalImage is. I'm sure the originalImages are perfectly fine before I do these operations (I've checked).
My images are all 9:16 (say 72px wide by 128px tall) if that matters.
I've seen that UIGraphics creates a bitmap with an "ARGB 32-bit integer pixel format using host-byte order"; I'll admit a bit of ignorance when it comes to pixel formats, but I felt this MAY be relevant because I'm not sure whether that's the same pixel format I use to encode the picture data.
No idea how relevant this is but here is the FULL processing pipeline: I'm capturing using AVFoundation and I set my photoSettings as
NSDictionary *photoSettings = @{AVVideoCodecKey : AVVideoCodecH264};
capturing using captureStillImageAsynchronouslyFromConnection:..,
then turning it into NSData using [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer],
then downsizing into a thumbnail by creating a CGDataProviderRef with CGDataProviderCreateWithCFData,
converting to a CGImageRef using CGImageSourceCreateThumbnailAtIndex,
and getting a UIImage from that.
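In case it matters, this is roughly what that downsizing step looks like; the jpegData name and the 128px max size are placeholders, not my exact values:

#import <ImageIO/ImageIO.h>

// Rough sketch of the downsizing step. "jpegData" and the 128px max
// pixel size are placeholders, not the exact values from my app.
CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)jpegData);
CGImageSourceRef source = CGImageSourceCreateWithDataProvider(provider, NULL);

NSDictionary *options = @{(id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                          (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // respect EXIF orientation
                          (id)kCGImageSourceThumbnailMaxPixelSize : @128};
CGImageRef thumbnailRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);

UIImage *thumbnail = [UIImage imageWithCGImage:thumbnailRef];

CGImageRelease(thumbnailRef);
CFRelease(source);
CGDataProviderRelease(provider);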
Later, I once again turn it into NSData using UIImageJPEGRepresentation(thumbnail, 0.7) so I can store it. And finally, when I'm ready to display, I call my own method detailed on top, [self maskImage:[UIImage imageWithData:imageData] toPath:_thumbnailPath], display it in a UIImageView, and set contentMode = UIViewContentModeScaleAspectFit.
If the method I'm using to mask the UIImage with the UIBezierPath is fine, I may end up explicitly setting the photoOutput settings with [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey, nil], and then I can probably use something like how to convert a CVImageBufferRef to UIImage and change a lot of my code... but I'd really rather not do that unless it's completely necessary since, as I've mentioned, I really don't know much about video encoding or all these graphical, low-level objects.
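For completeness, that conversion seems to be only a handful of Core Image calls; this is a generic sketch assuming a sampleBuffer from the capture callback, not code I'm actually running:

#import <CoreImage/CoreImage.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Generic sketch: turn the CVImageBufferRef from a capture callback into a UIImage.
// "sampleBuffer" is whatever the capture completion handler hands back.
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

CIContext *ciContext = [CIContext contextWithOptions:nil];
CGRect extent = CGRectMake(0, 0,
                           CVPixelBufferGetWidth(pixelBuffer),
                           CVPixelBufferGetHeight(pixelBuffer));
CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:extent];

UIImage *converted = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);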
This line:
[originalImage drawInRect:CGRectMake(0, originalImage.size.height/-3, thumbnailSize.width, originalImage.size.height)];
is a problem. You are drawing originalImage, but you give it the width of thumbnailSize.width and the height of originalImage.size.height. Those two values come from different sizes, so they distort the image's aspect ratio.

The width and height you pass to drawInRect: need to be based on the same image size. Pick one dimension to fit and derive the other from it so the proper aspect ratio is maintained.
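A minimal sketch of one way to do that, reusing the names from your snippet (the vertical centering is just one choice for where to crop):

CGSize thumbnailSize = CGSizeMake(54.0f, 45.0f);
UIGraphicsBeginImageContextWithOptions(thumbnailSize, NO, 0);
[path addClip];

// Scale from one image size only: fill the thumbnail's width, then derive
// the drawn height from originalImage's own aspect ratio.
CGFloat scale = thumbnailSize.width / originalImage.size.width;
CGFloat drawnHeight = originalImage.size.height * scale;

// Center the taller-than-needed image vertically behind the clipping path.
CGFloat yOffset = (thumbnailSize.height - drawnHeight) / 2.0f;
[originalImage drawInRect:CGRectMake(0, yOffset, thumbnailSize.width, drawnHeight)];

UIImage *maskedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();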