I've had code for over a year that successfully scales, rotates and converts a UIImage to 1bpp (black and white). It works fine on my iPhone 5s, iPads I've tested, etc. However, my iPhone 4 running iOS 7.1.2 crashes.
My code (a category on UIImage) is:
    - (void)prepareImageForOCRWithCompletionHandler:(void (^)(UIImage *image))completion {
        if (!completion) return; // no sense in doing anything if nobody listens for the completion

        int width = self.size.width;
        int height = self.size.height;
        DLog(@"width=%d, height=%d", width, height);
        if (width > height) {
            int temp = width;
            width = height;
            height = temp;
        }

        int scaledWidth = 1728; // class F tiffs should be 1728, 2048 or 2483 pixels wide http://www.libtiff.org/support.html
        int scaledHeight = scaledWidth * height / width;
        DLog(@"width=%d, height=%d, scaledWidth=%d, scaledHeight=%d", width, height, scaledWidth, scaledHeight);

        GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:self];
        GPUImageFilter *rotateFilter = [[GPUImageFilter alloc] init];
        [rotateFilter setInputRotation:kGPUImageRotateRight atIndex:0];
        GPUImageTransformFilter *scaleFilter = [[GPUImageTransformFilter alloc] init];
        [scaleFilter forceProcessingAtSizeRespectingAspectRatio:CGSizeMake(scaledWidth, scaledHeight)];
        GPUImageAdaptiveThresholdFilter *thresholdFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];

        [source addTarget:rotateFilter];
        [rotateFilter addTarget:scaleFilter];
        [scaleFilter addTarget:thresholdFilter];

        [source processImageWithCompletionHandler:^{
            UIImage *result = [thresholdFilter imageFromCurrentlyProcessedOutput];
            completion(result);
        }];
    }
The app crashes before the GPUImage completion handler is called with this message in the Xcode console:
2015-02-05 13:39:10.131 MyApp[345:60b] -[UIImage(BJM) prepareImageForOCRWithCompletionHandler:] [Line 54] width=1936, height=2592
2015-02-05 13:39:10.133 MyApp[345:60b] -[UIImage(BJM) prepareImageForOCRWithCompletionHandler:] [Line 65] width=1936, height=2592, scaledWidth=1728, scaledHeight=2313
2015-02-05 13:39:12.328 MyApp[345:7273] *** Assertion failure in -[GPUImageTransformFilter createFilterFBOofSize:], /Users/brian/repos/GPUImage/framework/Source/GPUImageFilter.m:380
2015-02-05 13:39:12.364 MyApp[345:7273] void uncaughtExceptionHandler(NSException *__strong) [Line 75] uncaught exception: Incomplete filter FBO: 36054
2015-02-05 13:39:12.366 MyApp[345:7273] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Incomplete filter FBO: 36054'
*** First throw call stack:
(0x30560f83 0x3ad11ccf 0x30560e5d 0x30f0ed5b 0x2a7adf 0x2aa82d 0x2a73a7 0x2a7c77 0x2a7daf 0x2b0265 0x2a80d5 0x2a9285 0x2acd49 0x3b1f9833 0x3b1f9ded 0x3b1fa297 0x3b20c88d 0x3b20cb21 0x3b33bbd3 0x3b33ba98)
libc++abi.dylib: terminating with uncaught exception of type NSException
If I remove the GPUImageTransformFilter from the pipeline, it works. If I reinstate the transform filter but change the 1728 to 1000, it also works. This seems to suggest that the original image is somehow too small, yet the original is 1936x2592. I also changed my width and height ints to CGFloats, thinking a rounding error was causing the issue. No luck.
This is truly a humbling issue. Thanks for your time.
Your problem is related to image size. From the log output, I can see that you're working with an image that's 2592 pixels tall.
The GPUs in the iPhone 4 and older devices have a maximum texture size of 2048 pixels in either dimension. Attempting to create a texture larger than that in either width or height will fail, which is why I throw an assertion there.
Unfortunately, I don't currently support images larger than the texture limit of whatever GPU you're working with. You'll need to find another way to resize images larger than 2048x2048 before they enter the pipeline on those older devices. I do handle this in a handful of places within the framework (to deal with photos taken on an iPhone 4, for example), but not everywhere.
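One way to do that pre-scaling is with plain Core Graphics before you create the GPUImagePicture. This is only a sketch, not part of the framework; `imageConstrainedToMaxDimension:` is a hypothetical category method, and the 2048 value is assumed here (you can query the real limit with `glGetIntegerv(GL_MAX_TEXTURE_SIZE, ...)` in a valid GL context):

```
// Hypothetical UIImage category method: downscale so the largest side
// fits within maxDimension, preserving aspect ratio.
- (UIImage *)imageConstrainedToMaxDimension:(CGFloat)maxDimension {
    CGFloat largestSide = MAX(self.size.width, self.size.height);
    if (largestSide <= maxDimension) return self; // already small enough

    CGFloat scale = maxDimension / largestSide;
    CGSize newSize = CGSizeMake(floor(self.size.width * scale),
                                floor(self.size.height * scale));
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [self drawInRect:CGRectMake(0.0, 0.0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}
```

You'd then call something like `[[self imageConstrainedToMaxDimension:2048.0] ...]` at the start of your OCR preparation method, and the rest of your pipeline should stay under the limit.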