I am taking multiple pictures continuously and processing them with the GPUImage framework. I have a helper class whose job is essentially to run a GPUImageSubtractBlendFilter. Here is what I do:
#import "ImageProcessor.h"

@interface ImageProcessor ()
@end

@implementation ImageProcessor {
    GPUImageSubtractBlendFilter *subFilter; // created once and reused for every subtraction
}

- (id)init {
    self = [super init];
    if (self) {
        subFilter = [[GPUImageSubtractBlendFilter alloc] init];
    }
    return self;
}
- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
    // @autoreleasepool {

    //CAUSING MEMORY ISSUE
    GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
    GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];
    //MEMORY ISSUE END

    [img1 addTarget:subFilter];
    [img2 addTarget:subFilter];
    [img1 processImage];
    [img2 processImage];
    [subFilter useNextFrameForImageCapture];
    processedImage = [subFilter imageFromCurrentFramebuffer];
    // }

    //consider modifications to filter possibly?
    return processedImage;
}
Memory grows continuously and is never deallocated, even with ARC enabled. I debugged it and narrowed it down to these two allocations, which are at the heart of the problem:
img1 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:1]];
img2 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:0]];
Am I missing anything here, or is there anything I should do differently so I don't continuously allocate GPUImagePicture variables?
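For reference, this is the direction I am considering, based on the GPUImage README (which calls useNextFrameForImageCapture before processImage when capturing a still from a filter) plus explicitly removing targets so framebuffers can be recycled. This is only a sketch; I have not confirmed that removeAllTargets or the autorelease pool is the right fix:

- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
    @autoreleasepool {
        GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
        GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];

        [img1 addTarget:subFilter];
        [img2 addTarget:subFilter];

        // Per the GPUImage README, useNextFrameForImageCapture goes
        // BEFORE processImage when capturing a still image from a filter.
        [subFilter useNextFrameForImageCapture];
        [img1 processImage];
        [img2 processImage];

        processedImage = [subFilter imageFromCurrentFramebuffer];

        // Assumption: detaching the pictures lets GPUImage recycle
        // their framebuffers instead of keeping them alive across calls.
        [img1 removeAllTargets];
        [img2 removeAllTargets];
    }
    return processedImage;
}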
Here is where the call originates:
- (void)burstModeCapture:(AVCaptureConnection *)videoConnection :(int)i { //start capturing pictures rapidly and cache them in RAM
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    NSLog(@"time entering: %d", i);

    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (error)
            NSLog(@"%s", [[error localizedDescription] UTF8String]);

        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
        CVPixelBufferLockBaseAddress(cameraFrame, 0);
        Byte *rawImageBytes = CVPixelBufferGetBaseAddress(cameraFrame);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
        size_t width = CVPixelBufferGetWidth(cameraFrame);
        size_t height = CVPixelBufferGetHeight(cameraFrame);
        NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];
        // Do whatever with your bytes

        // create suitable color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // create suitable context (suitable for camera output setting kCVPixelFormatType_32BGRA)
        CGContextRef newContext = CGBitmapContextCreate(rawImageBytes, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

        // release color space
        CGColorSpaceRelease(colorSpace);

        // create a CGImageRef from the CVImageBufferRef
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];
        [imagesArray addObject:FinalImage]; //append image to array
        dispatch_group_leave(group);
    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{ //execute function recursively to shoot n photos
        //base case to stop shooting pictures
        shootCounter--;
        if (shootCounter <= 0) {
            [flash turnOffFlash];
            shootCounter = NUMSHOTS;
            UIImage *output = [self processImages]; //THIS IS WHERE MEMORY STARTS ACCUMULATING
            [self updateUIWithOutput:output];
            NSLog(@"Done shooting!");
        }
        else {
            [NSThread sleepForTimeInterval:0.1];
            [self burstModeCapture:videoConnection :shootCounter];
        }
    });
}
I run this function recursively twice to capture pairs of images. [imageProcessor flashSubtract] is where the problem shows up.
You are missing a CGContextRelease(newContext) after your
CGImageRef newImage = CGBitmapContextCreateImage(newContext);
line. ARC does not manage Core Graphics objects, so anything obtained from a Create function must be released explicitly; this is likely the source of your leak.
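For completeness, here is a sketch of how the end of that completion handler could look with the releases in place. Note that CGBitmapContextCreateImage also follows the Create rule, so newImage needs a CGImageRelease as well:

CGImageRef newImage = CGBitmapContextCreateImage(newContext);
UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];

// Both objects came from Create functions, so ARC will not release
// them; do it explicitly once the UIImage has been created (the
// UIImage retains the CGImage, so releasing here is safe).
CGImageRelease(newImage);
CGContextRelease(newContext);

[imagesArray addObject:FinalImage]; //append image to array
dispatch_group_leave(group);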