Tags: ios, objective-c, memory-management, instruments, avcapturesession

AVCaptureSession abandoned memory - Allocations - Instruments


I use a default AVCaptureSession to capture the camera view.
Everything works fine and I don't have any leaks, but when I use the Allocations instrument to look for abandoned memory after opening and closing the capture view, it shows approximately 230 objects that are still live.

Here is my code:

Controller.h:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface Controller : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
    AVCaptureSession *captureSession;
    AVCaptureDevice *device;

    IBOutlet UIView *previewLayer;
}
@property (nonatomic, retain) AVCaptureSession *captureSession;
@property (nonatomic, retain) UIView *previewLayer;

- (void)setupCaptureSession;
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end

Controller.m:

- (void)setupCaptureSession {       
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]];

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Retain the device: it is released in dealloc
    device = [[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] retain];

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the device input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Capture frames as 32-bit BGRA
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // Limit capture to 15 frames per second
    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection { 
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) {
        // Create a UIImage from the sample buffer data
        mutex = NO;
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

        image = [Tools rotateImage:image andRotateAngle:UIImageOrientationUp];  

        CGRect rect;
        rect.size.width = 210;
        rect.size.height = 50;
        rect.origin.x = 75;
        rect.origin.y = 175;

        UIImage *croppedImage = [image resizedImage:image.size interpolationQuality:kCGInterpolationHigh];
        croppedImage = [croppedImage croppedImage:rect];

        croppedImage = [self processImage:croppedImage];
        [NSThread detachNewThreadSelector:@selector(threadedReadAndProcessImage:) toTarget:self withObject:croppedImage];
    }
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    CGImageRelease(quartzImage);

    return image;
}

And I clean everything up with this code:

- (void)cancelTapped {
    [[self captureSession] stopRunning];
    self.captureSession = nil;

    for (UIView *view in self.previewLayer.subviews) {
        [view removeFromSuperview];
    }

    [self dismissModalViewControllerAnimated:YES];
}

- (void)dealloc {
    [captureSession release];
    [device release];
    [previewLayer release];

    // [super dealloc] must come last, after the ivars are released
    [super dealloc];
}

And Instruments shows me something like this: https://i.sstatic.net/NBWgZ.png

https://i.sstatic.net/1GB6C.png

Any ideas what I am doing wrong?


Solution

- (void)setupCaptureSession {
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]];
    ...

That leaks the capture session: alloc/init returns an owning reference, and the retain property setter takes a second one, so the session's retain count never drops back to zero. The leaked session in turn keeps all of its inputs and outputs, and all their little internal helpers, alive; that is exactly what Allocations reports as abandoned memory.

Two options:

// Option 1: balance the alloc with an explicit release
AVCaptureSession *session = [[AVCaptureSession alloc] init];
self.captureSession = session;
[session release];

// Option 2: let the autorelease pool balance the alloc
self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
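
For completeness, here is a minimal sketch of how the matching teardown could look once the extra retain is gone. This is a sketch under assumptions, not the asker's exact code: it reuses the `captureSession` property and `cancelTapped` from the question, defensively clears the sample-buffer delegate, and also removes the AVCaptureVideoPreviewLayer, which the question inserts as a sublayer (so removing subviews never detaches it) and which holds its own reference to the session:

- (void)cancelTapped {
    AVCaptureSession *session = [self captureSession];
    [session stopRunning];

    // Defensive (assumption): stop delegate callbacks before tearing down
    for (AVCaptureOutput *output in [session outputs]) {
        if ([output isKindOfClass:[AVCaptureVideoDataOutput class]]) {
            [(AVCaptureVideoDataOutput *)output setSampleBufferDelegate:nil queue:NULL];
        }
    }

    // With the alloc balanced, the property holds the only owning
    // reference, so this releases the session with its inputs and outputs
    self.captureSession = nil;

    // The preview layer was added with insertSublayer:, not addSubview:,
    // so detach it explicitly; copy the sublayers array first because
    // removeFromSuperlayer mutates it
    NSArray *sublayers = [[self.previewLayer.layer.sublayers copy] autorelease];
    for (CALayer *layer in sublayers) {
        if ([layer isKindOfClass:[AVCaptureVideoPreviewLayer class]]) {
            [layer removeFromSuperlayer];
        }
    }

    [self dismissModalViewControllerAnimated:YES];
}

With the retain count balanced, re-running the Allocations instrument after dismissing the controller should show those roughly 230 live objects go away.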