Tags: ios, objective-c, gpuimage

iOS - Reduce GPUImage RAM usage


I am creating a filter with GPUImage. The image displays fine. I also have a UISlider that the user can drag to change the alpha of the filter.

Here is how I set up my filter:

-(void)setupFirstFilter
{
    imageWithOpacity = [ImageWithAlpha imageByApplyingAlpha:0.43f image:scaledImage];
    pictureWithOpacity = [[GPUImagePicture alloc] initWithCGImage:[imageWithOpacity CGImage] smoothlyScaleOutput:YES];

    originalPicture = [[GPUImagePicture alloc] initWithCGImage:[scaledImage CGImage] smoothlyScaleOutput:YES];

    multiplyBlender = [[GPUImageMultiplyBlendFilter alloc] init];
    [originalPicture addTarget:multiplyBlender];
    [pictureWithOpacity addTarget:multiplyBlender];

    UIImage *pinkImage = [ImageFromColor imageFromColor:[UIColor colorWithRed:255.0f/255.0f green:185.0f/255.0f blue:200.0f/255.0f alpha:0.21f]];
    pinkPicture = [[GPUImagePicture alloc] initWithCGImage:[pinkImage CGImage] smoothlyScaleOutput:YES];

    overlayBlender = [[GPUImageOverlayBlendFilter alloc] init];
    [multiplyBlender addTarget:overlayBlender];
    [pinkPicture addTarget:overlayBlender];

    UIImage *blueImage = [ImageFromColor imageFromColor:[UIColor colorWithRed:185.0f/255.0f green:227.0f/255.0f blue:255.0f/255.0f alpha:0.21f]];
    bluePicture = [[GPUImagePicture alloc] initWithCGImage:[blueImage CGImage] smoothlyScaleOutput:YES];

    secondOverlayBlend = [[GPUImageOverlayBlendFilter alloc] init];
    [overlayBlender addTarget:secondOverlayBlend];
    [bluePicture addTarget:secondOverlayBlend];

    [secondOverlayBlend addTarget:self.editImageView];

    [originalPicture processImage];
    [pictureWithOpacity processImage];
    [pinkPicture processImage];
    [bluePicture processImage];
}

And when the slider is changed this gets called:

-(void)sliderChanged:(id)sender
{
    UISlider *slider = (UISlider*)sender;
    double value = slider.value;

    [originalPicture addTarget:multiplyBlender];
    [pictureWithOpacity addTarget:multiplyBlender];

    UIImage *pinkImage = [ImageFromColor imageFromColor:[UIColor colorWithRed:255.0f/255.0f green:185.0f/255.0f blue:200.0f/255.0f alpha:value]];
    pinkPicture = [[GPUImagePicture alloc] initWithCGImage:[pinkImage CGImage] smoothlyScaleOutput:NO];

    [multiplyBlender addTarget:overlayBlender];
    [pinkPicture addTarget:overlayBlender];

    [overlayBlender addTarget:secondOverlayBlend];
    [bluePicture addTarget:secondOverlayBlend];

    [secondOverlayBlend addTarget:self.editImageView];

    [originalPicture processImage];
    [pictureWithOpacity processImage];
    [pinkPicture processImage];
    [bluePicture processImage];
}

The code above works fine, but the slider is sluggish and memory climbs to around 170 MB of RAM; before applying the filter it is around 30 MB. How can I reduce the RAM usage of this filter?

I have already reduced the image size.

Any help is greatly appreciated.


Solution

  • My first suggestion is to get rid of the single-color UIImages and their corresponding GPUImagePicture instances. Instead, use a GPUImageSolidColorGenerator, which does this solid-color generation entirely on the GPU. Have it render at a small output size; the result will be scaled up to fit your larger image. That saves the memory required for your UIImages and avoids a costly Core Graphics draw / texture upload process.
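
    As a sketch of that change, reusing the variable names and color values from the question (the exact output size is just a suggestion):

        // Replace the pink UIImage + GPUImagePicture pair with a GPU-side color source.
        // GPUImageSolidColorGenerator renders the color on the GPU, so no UIImage
        // is allocated and no texture upload happens.
        pinkGenerator = [[GPUImageSolidColorGenerator alloc] init];
        [pinkGenerator forceProcessingAtSize:CGSizeMake(1.0, 1.0)];
        [pinkGenerator setColorRed:255.0/255.0 green:185.0/255.0 blue:200.0/255.0 alpha:0.21];
        [pinkGenerator addTarget:overlayBlender];

    In the slider callback you would then just call -setColorRed:green:blue:alpha: with the new alpha, instead of rebuilding a UIImage and GPUImagePicture on every change.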

    Ultimately, however, I'd recommend making your own custom filter rather than running multiple blend steps using multiple input images. All that you're doing is applying a color modification to your source image, which can be done inside a single custom filter.

    You could pass in your colors to a shader that applies two mix() operations, one for each color. The strength of each mix value would correspond to the alpha you're using above for each solid color. That would reduce this down to one input image and one processing step, rather than three input images and two steps. It would be faster and use significantly less memory.
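
    As a sketch, a single-pass fragment shader along those lines might look like the following (the uniform names and comments are illustrative, not part of GPUImage; the multiply-with-opacity step would fold into the chosen colors and strengths):

        NSString *const kTintFragmentShader = SHADER_STRING
        (
         varying highp vec2 textureCoordinate;
         uniform sampler2D inputImageTexture;

         uniform lowp vec3 firstColor;      // e.g. the pink tint
         uniform lowp float firstStrength;  // driven by the UISlider
         uniform lowp vec3 secondColor;     // e.g. the blue tint
         uniform lowp float secondStrength;

         void main()
         {
             lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
             color.rgb = mix(color.rgb, firstColor, firstStrength);
             color.rgb = mix(color.rgb, secondColor, secondStrength);
             gl_FragColor = color;
         }
        );

    Load that into a GPUImageFilter (or a subclass of it) via -initWithFragmentShaderFromString:, then update the strength uniforms from the slider callback using the filter's uniform setters; nothing needs to be re-created per slider tick.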