iphone · objective-c · avcapturesession · avcapture

Proper way to optimize my AVCaptureSession?


I got my AVCaptureSession to work, and it duplicates the Camera.app UI almost perfectly; however, after a few seconds the application crashes and I just cannot find what I'm doing wrong. I really hope someone knows how to optimize this!

I AM using ARC, and again, the whole session runs fine but crashes after a little while. The AVCaptureSession delegate method gets called what seems like every second. If there's a way to call that method only when the user presses the "take picture" button, how can I do that while still maintaining the "live" preview layer?

Thanks in advance!

Setting up the session

NSError *error = nil;
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// sample buffers are delivered on this background serial queue
// (dispatch_release is still required here; dispatch objects are not
// ARC-managed until the iOS 6 SDK)
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// 'version' is the iOS version, determined elsewhere
if (version >= 4.0 && version < 5.0) {
    output.minFrameDuration = CMTimeMake(1, 15);
}
output.alwaysDiscardsLateVideoFrames = YES;

previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];
[self.view addSubview:camera_overlay];
[session startRunning];

The AVCaptureSession delegate method that is being called:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // a new UIImage is built for every single frame and then discarded
    UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
}

Method that gets the UIImage from the sample buffer

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    // lock the pixel buffer so its base address can be read directly
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // wrap the raw BGRA bytes in a bitmap context and copy them out as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // the UIImage is autoreleased; the CGImage can be released immediately
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    return image;
}

Solution

  • Take a look at the AVCam Demo app from Apple for a complete example; a minimal still-capture sketch in the same spirit appears at the end of this answer.

    The method

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection

    is called every time a camera frame is ready, not only when the user takes a picture. In your case it should be called 15 times a second, since you capped the frame rate with output.minFrameDuration = CMTimeMake(1, 15);.
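
    (As an aside: minFrameDuration on the output was deprecated in iOS 5 in favor of the equivalent property on AVCaptureConnection. A minimal sketch, assuming you can target iOS 5 or later:)

    // sketch: cap the delegate rate at ~15 fps via the connection (iOS 5+)
    AVCaptureConnection *conn = [output connectionWithMediaType:AVMediaTypeVideo];
    if (conn.supportsVideoMinFrameDuration) {
        conn.videoMinFrameDuration = CMTimeMake(1, 15);
    }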

    From the code you provided, the only reason I can think of is that the UIImage *capture_image is never released: you build a new image for every frame, and under ARC each one is autoreleased, so on a background dispatch queue they can pile up faster than the autorelease pool drains until the app runs out of memory.
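
    If the crash is indeed memory pressure, here is a sketch of one way to fix it: wrap the per-frame work in an autorelease pool so images are reclaimed promptly, and only do the expensive conversion when the user has asked for a picture. (captureRequested is a hypothetical BOOL property your "take picture" button sets to YES, and didCaptureImage: a hypothetical handler; neither is in the question's code.)

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            // do nothing unless the user actually asked for a picture;
            // the live preview layer is driven by the session itself and
            // keeps updating regardless of what happens here
            if (!self.captureRequested) return;
            self.captureRequested = NO;

            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            // hand the result to the UI on the main thread
            dispatch_async(dispatch_get_main_queue(), ^{
                [self didCaptureImage:image]; // hypothetical handler
            });
        }
    }

    This also addresses the question above: the delegate still fires for every frame, but it returns immediately except for the one frame you want, and the "live" preview keeps running.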

    You can use Instruments in Xcode to profile your application and see why that happens: Instruments Guide

    The Leaks tool is your first stop in your case. There are many tutorials on the web for it; here is one: Tracking iPhone Memory Leaks, written by SO user OwenGross and, if I'm not mistaken, taken from here.
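
    Finally, if you only ever want an image when the button is pressed, the cleaner route (and the one AVCam takes) is to add an AVCaptureStillImageOutput next to the preview and skip the per-frame delegate entirely. A minimal sketch; stillOutput and the button action are assumptions, not code from the question:

    // sketch: still-image capture on demand; the preview layer stays live
    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    stillOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                              forKey:AVVideoCodecKey];
    if ([session canAddOutput:stillOutput]) {
        [session addOutput:stillOutput];
    }

    // in the "take picture" button action:
    AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
        if (buffer) {
            NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *photo = [[UIImage alloc] initWithData:jpeg];
            // use photo: save it, show it, etc.
        }
    }];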