
iOS memory building up while creating UIImage from CMSampleBufferRef


I'm creating UIImage objects from CMSampleBufferRefs. I'm doing this on a separate queue (in the background), so I'm wrapping the processing in an @autoreleasepool. The problem is that memory keeps building up without any leak being reported. Below is the method I'm using:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    @autoreleasepool {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image; retain it so it survives
        // the autorelease pool (MRC) -- the caller is responsible for releasing it
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }
}

And this is how I'm using it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    CFRetain(sampleBuffer);
    dispatch_async(movieWritingQueue, ^{
    @autoreleasepool {

        if (self.returnCapturedImages && captureOutput != audioOutput) {

            UIImage *capturedImage = [self imageFromSampleBuffer: sampleBuffer];

            dispatch_async(callbackQueue, ^{

                @autoreleasepool {

                    if (self.delegate && [self.delegate respondsToSelector: @selector(recorderCapturedImage:)]) {
                        [self.delegate recorderCapturedImage: capturedImage];
                    }

                    [capturedImage release];
                }
            });
        }
        CFRelease(sampleBuffer);
    }
    });
}
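
A plausible explanation for the buildup (an assumption on my part, consistent with how AVCaptureVideoDataOutput recycles its buffers): sample buffers come from a small, fixed pool, and the CFRetain keeps each one alive until the background block finishes, so holding several buffers across async hops keeps large pixel buffers mapped even though nothing "leaks". One way around this is to copy the pixel data out synchronously and release the sample buffer before queueing any work. The sketch below does that; the helper name is mine, not part of the original code, and it assumes the output is configured for kCVPixelFormatType_32BGRA:

// Hypothetical helper (sketch): builds a UIImage backed by its own copy of
// the pixel data, so the sample buffer does not need to outlive the callback.
- (UIImage *)copiedImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // CFDataCreate makes its own copy of the bytes, so nothing references
    // the pixel buffer after this method returns.
    CFDataRef pixelData = CFDataCreate(NULL, baseAddress, bytesPerRow * height);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    CGDataProviderRef provider = CGDataProviderCreateWithCFData(pixelData);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow,
                                       colorSpace,
                                       kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                       provider, NULL, false, kCGRenderingIntentDefault);

    UIImage *image = [[UIImage imageWithCGImage:cgImage] retain];  // caller releases (MRC)

    CGImageRelease(cgImage);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    CFRelease(pixelData);

    return image;
}

Because the UIImage now owns its own copy of the bytes, the delegate can hold it as long as it likes without pinning a capture buffer from the pool.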

Solution

  • I found a temporary solution. I'm doing the same operations but on the main queue. This is not elegant or efficient at all, but at least the memory doesn't build up anymore.

    I'm wondering if this is an iOS bug...?

    UPDATE: This is how I'm processing the CMSampleBuffers on the main thread:

    [[NSOperationQueue mainQueue] addOperationWithBlock:^ {
    
        CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
        UIImage *capturedImage = [UIImage imageWithCGImage:cgImage];
    
        //do something with the image - I suggest in a background thread
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
           // do something with the image
        });
    
        CGImageRelease( cgImage );
        CFRelease(sampleBuffer);
    }];
    
    - (CGImageRef) cgImageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer
    
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);   // Get information of the image (the BGRA buffer is non-planar)
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        CGContextRelease(newContext);
    
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
        /* CVBufferRelease(imageBuffer); */  // do not call this!
    
        return newImage;    // The caller is responsible for calling CGImageRelease()
    }
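
    Note that the CFRelease(sampleBuffer) at the end of the main-queue block only balances if the buffer was retained before the block was scheduled, just as in the movieWritingQueue path earlier. Assuming the same capture delegate (a sketch on my part -- the call site isn't shown in the original answer), it would look like:

        // Inside captureOutput:didOutputSampleBuffer:fromConnection:
        // Retain the buffer before the async block captures it;
        // the CFRelease(sampleBuffer) inside the block balances this retain.
        CFRetain(sampleBuffer);
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
            // ... use cgImage as shown above ...
            CGImageRelease(cgImage);
            CFRelease(sampleBuffer);
        }];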