I'm trying to speed up my image-processing code. One approach I tried was to create a CIImage directly, like so:
    CIImage *ciImageStrong;
    if (_cachedData)
    {
        // Cached-data path: read the file into an NSData and build the CIImage from it
        _cachedData = [NSData dataWithContentsOfFile:pathForResource];
        ciImageStrong = [CIImage imageWithData:_cachedData];
    }
    else
    {
        // Direct path: let Core Image read the file itself
        ciImageStrong = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:pathForResource]];
    }
My problem is that when I use the result with a standard @"CISourceOverCompositing" filter, all images are drawn ADDITIVELY instead of with a normal alpha blend.
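For completeness, this is roughly how I apply the filter (a minimal sketch; foregroundCIImage and backgroundCIImage stand in for my actual images):

```objc
#import <CoreImage/CoreImage.h>

CIFilter *composite = [CIFilter filterWithName:@"CISourceOverCompositing"];
[composite setValue:foregroundCIImage forKey:kCIInputImageKey];           // image drawn on top
[composite setValue:backgroundCIImage forKey:kCIInputBackgroundImageKey]; // image drawn underneath
CIImage *outputImage = composite.outputImage;
```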
When I use the following code, it all works fine:
    UIImage *uiImageStrong;
    if (_cachedData)
    {
        _cachedData = [NSData dataWithContentsOfFile:pathForResource];
        uiImageStrong = [UIImage imageWithData:_cachedData];
    }
    else
    {
        uiImageStrong = [UIImage imageWithContentsOfFile:pathForResource];
    }
CIImage* ciImageStrong = [CIImage imageWithCGImage:uiImageStrong.CGImage];
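To actually draw the filter output I render through a CIContext, roughly like this (a sketch; filterOutput is a placeholder for the filter's output image, and error handling is omitted):

```objc
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

CIContext *context = [CIContext contextWithOptions:nil];
// Rasterize the CIImage into a CGImage over its full extent
CGImageRef cgImage = [context createCGImage:filterOutput fromRect:filterOutput.extent];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage follows the Create rule, so release it
```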
I've tried loading it with the kCGColorSpaceModelRGB color space, to no avail. Questions: why does the blending differ between the two loading paths, and is there a way to load the image correctly as a CIImage directly?

In the end I've learned enough about CGImage to know that the first solution wouldn't have sped up the loading of images anyway. [UIImage imageWithContentsOfFile:pathForResource] is very fast, and UIImage.CGImage has no discernible performance cost whatsoever.
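A quick way to sanity-check that claim is a rough timing like this (a sketch, not a proper benchmark; pathForResource is assumed to point at a real image file):

```objc
#import <UIKit/UIKit.h>

CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
UIImage *uiImage = [UIImage imageWithContentsOfFile:pathForResource];
CGImageRef cgImage = uiImage.CGImage; // effectively free: no decode or copy happens here
CFAbsoluteTime elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000.0;
NSLog(@"load + CGImage access: %.3f ms (%zu x %zu)",
      elapsedMs, CGImageGetWidth(cgImage), CGImageGetHeight(cgImage));
```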