I have a UIImage that I snap from the camera.
I want to do some image manipulations, so I convert it to a CIImage:
CIImage *ciImage = [CIImage imageWithCGImage:snappedImage.CGImage];
Next I do my thing, and then transform it back to a UIImage:
UIImage *image = [UIImage imageWithCIImage:ciImage];
Now if I want to convert this to data, to save it into Core Data for example:
NSData *data = UIImagePNGRepresentation(image);
the data ends up being nil on iOS 12. (It works on iOS 13, and the same thing happens if I use UIImageJPEGRepresentation.)
Any idea why and how to fix this?
I see a lot of problems emerging when using the [UIImage imageWithCIImage:] API. You see, your CIImage is not actually rendered when you call this method. The resulting UIImage is still just a "recipe" for how to create the final image. It's up to the user of the UIImage to know that and trigger the rendering if needed. UIImageViews seem to know that, but I've seen a lot of inconsistencies across other APIs.
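You can verify this with a quick check (a sketch using the variables from the question): a UIImage created this way has no backing CGImage, which is what UIImagePNGRepresentation needs to encode.

UIImage *image = [UIImage imageWithCIImage:ciImage];

// image.CGImage is NULL here; the image only wraps the CIImage "recipe",
// there is no bitmap behind it yet.
if (image.CGImage == NULL) {
    NSLog(@"No backing CGImage, so UIImagePNGRepresentation has nothing to encode");
}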
What you can do instead is use a CIContext and perform the rendering yourself explicitly:
// ideally you create this context once and re-use it because it's an expensive object
CIContext *context = [CIContext context];

// sRGB is a sensible default color space for PNG output
CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);

// render the CIImage and encode the result as PNG data in one step
NSData *data = [context PNGRepresentationOfImage:ciImage format:kCIFormatBGRA8 colorSpace:colorSpace options:nil];
CGColorSpaceRelease(colorSpace);
This should work consistently across iOS versions (PNGRepresentationOfImage:format:colorSpace:options: is available since iOS 11).
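If you'd rather keep using UIImagePNGRepresentation / UIImageJPEGRepresentation, a minimal sketch of the same idea (not from the original answer; variable names follow the question) is to render the CIImage into a CGImage first and build the UIImage from that:

CIContext *context = [CIContext context]; // again, ideally created once and re-used

// explicitly render the "recipe" into a bitmap-backed CGImage
CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
UIImage *renderedImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

// the UIImage is now backed by real pixels, so this no longer returns nil
NSData *pngData = UIImagePNGRepresentation(renderedImage);

Note that imageWithCGImage: drops the original scale and orientation; use imageWithCGImage:scale:orientation: if you need to preserve them.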