I've been struggling with this issue for the past 3 days and I can't figure out why I'm getting memory warnings and crashes from image filtering, or how to solve them.
The task is simple: the user can choose between different filters, and then save the picture.
I had various problems which I managed to overcome, and here is my final code (with everything unimportant removed).
// stillImageSource, soft, and medium are instance variables; declaring them
// locally in viewDidLoad would make them invisible to the filter methods below.
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Set up the filter chain now to avoid lag: building it inside the
    // "filter" methods costs about 3 seconds each time.
    stillImageSource = [[GPUImagePicture alloc] initWithImage:_photo];
    soft = [[GPUImageBrightnessFilter alloc] init];
    medium = [[GPUImageContrastFilter alloc] init];
    soft.brightness = -0.5;
    medium.contrast = 1.6;
}
- (IBAction)mediumFilter:(id)sender
{
    [stillImageSource addTarget:medium];
    [medium useNextFrameForImageCapture]; // hold the next framebuffer so it can be read back
    [stillImageSource processImage];
    //[medium endProcessing]; // Not sure what this does, but I've seen it around.
    [self.imgPicture setImage:[medium imageFromCurrentFramebuffer]];
    [stillImageSource removeAllTargets];
}
- (IBAction)denseFilter:(id)sender
{
    [stillImageSource addTarget:soft];
    [soft addTarget:medium]; // chain brightness into contrast
    [medium useNextFrameForImageCapture];
    [stillImageSource processImage];
    //[soft endProcessing];
    //[medium endProcessing]; // Not sure what this does, but I've seen it around.
    [self.imgPicture setImage:[medium imageFromCurrentFramebuffer]];
    [stillImageSource removeAllTargets];
    [soft removeAllTargets]; // otherwise soft keeps targeting medium after this action
}
- (IBAction)softFilter:(id)sender
{
    [stillImageSource addTarget:soft];
    [soft useNextFrameForImageCapture];
    [stillImageSource processImage];
    //[soft endProcessing];
    [self.imgPicture setImage:[soft imageFromCurrentFramebuffer]];
    [stillImageSource removeAllTargets];
}
The user can freely click the different filters. If he uses one every ~3 seconds, I only get memory warnings (on an iPhone 4S with no other apps running). If he uses 3 in a row, the app crashes with a memory warning.
The memory usage shown by Xcode is about 50 MB, which seems okay considering we're dealing with an image.
What I've tried:

- Alloc/init the image and the filters inside each filter method. I manage to get memory down to about 8 MB that way, but it gives the user a solid 3-second lag and still produces memory warnings. At the end of the filter method I set everything to nil, and here's the interesting part: memory doesn't go back down to 8 MB. Sometimes it does, but most of the time it just stacks up to anywhere from 40 to 75-80 MB.
- Using the default filters: it works fine, but filtering an image takes 10 to 15 seconds. My user will be dead by the time his image is filtered; that can't be done.
- Setting everything to nil. Still, on the next view, Xcode shows about 70 MB, and that's just regular memory. Since I've managed to get memory warnings and crashes at only 8 MB, I'm pretty sure the GPU memory is also in the red.
- Reducing the size of, and compressing, the image I'm working with. That didn't change much: it reduces the memory usage shown by Xcode to about 45 MB, but I still get the very same memory warnings and crashes after 3 or 4 filters.

Note: if I go very slowly, I really can apply as many filters as I want and I only get memory warnings, but no crash.
I'm open to suggestions and questions; I'm quite out of ideas. It's really just a matter of applying the most basic filters to a perfectly ordinary picture. I can show other bits of code if necessary.
Here's what I replied to the email you just sent me:
Well, one large problem here is that you’re going to and from UIImages. Every time you do that, you’re allocating a very large image and you’re not giving some of my framebuffer optimizations a chance to run. This will cause large memory spikes, particularly if you are setting these resulting images to a UIImageView and thus holding on to them.
Rather than creating a new GPUImagePicture every time you want to filter it, create the input GPUImagePicture once and chain filters off of it using addTarget:. Replace your UIImageView with a GPUImageView and target your filtered images at that. When you need to update your filter options, either swap out the filters or set the new options on your filters, then call -processImage on the original source image. This will cause all image processing to fully reside on the GPU (being a lot faster) and will be far more memory efficient.
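A rough sketch of that arrangement (assuming the stillImageSource, soft, and medium instance variables from above, plus a hypothetical gpuImageView outlet — a GPUImageView standing in for your old UIImageView):

- (IBAction)denseFilter:(id)sender
{
    // Tear down whatever chain the previous filter action built.
    [stillImageSource removeAllTargets];
    [soft removeAllTargets];
    [medium removeAllTargets];

    // Rebuild the chain, ending at the GPUImageView rather than a UIImage.
    [stillImageSource addTarget:soft];
    [soft addTarget:medium];
    [medium addTarget:self.gpuImageView];

    // Re-render; the result is drawn straight to the view on the GPU,
    // so no intermediate UIImage is ever allocated.
    [stillImageSource processImage];
}

The single-filter actions follow the same pattern; only the links between the source and the view change.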
I’d also recommend using -forceProcessingAtSize: or -forceProcessingAtSizeRespectingAspectRatio: on your first filter in your chain, and set that to the target pixel size of your view. There’s no sense in processing images at a higher resolution than you’ll display. This will also dramatically reduce filtering time and memory usage. When you need to capture your final image to disk, you can reset the image size to 0,0 to remove these constraints and get the full resolution image out.
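For example (the 640.0 × 640.0 preview size here is just a placeholder; use your GPUImageView's actual pixel size):

// Constrain processing to roughly the on-screen size while previewing.
[soft forceProcessingAtSizeRespectingAspectRatio:CGSizeMake(640.0, 640.0)];

// When saving, lift the constraint and re-render at full resolution.
[soft forceProcessingAtSize:CGSizeZero];
[medium useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *fullResolutionImage = [medium imageFromCurrentFramebuffer];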