ios · ios7 · avfoundation

How To Use AVCaptureStillImageOutput To Take Picture


I have a preview layer that is pulling from the camera and working as it should. I would like to take a picture when I press a button. I have initialized the AVCaptureStillImageOutput like this:

AVCaptureStillImageOutput *avCaptureImg = [[AVCaptureStillImageOutput alloc] init];

Then I am trying to take a picture using this object:

[avCaptureImg captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *) completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {  }];

I need help figuring out how to take a picture and save it in a variable. Thanks


Solution

  • You need to define an AVCaptureVideoPreviewLayer and add it to a view's layer:

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [self.view.layer addSublayer:captureVideoPreviewLayer];
    

    The session connects this layer to your AVCaptureDeviceInput.
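    The preview layer also needs a frame before anything shows up on screen; a minimal sketch (assuming `self.view` is the container view) looks like:

        captureVideoPreviewLayer.frame = self.view.bounds;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:captureVideoPreviewLayer];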

    Here's the full solution:

    /////////////////////////////////////////////////
    ////
    //// Utility to find front camera
    ////
    /////////////////////////////////////////////////
    -(AVCaptureDevice *) frontFacingCameraIfAvailable{
    
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        AVCaptureDevice *captureDevice = nil;
    
        for (AVCaptureDevice *device in videoDevices){
    
            if (device.position == AVCaptureDevicePositionFront){
    
                captureDevice = device;
                break;
            }
        }
    
        //  couldn't find one on the front, so just get the default video device.
        if (!captureDevice){
    
            captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }
    
        return captureDevice;
    }
    
    /////////////////////////////////////////////////
    ////
    //// Setup Session, attach Video Preview Layer
    //// and Capture Device, start running session
    ////
    /////////////////////////////////////////////////
    -(void) setupCaptureSession {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;
    
        AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        [self.view.layer addSublayer:captureVideoPreviewLayer];
    
        NSError *error = nil;
        AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input) {
            // Handle the error appropriately -- without an input the session is useless.
            NSLog(@"ERROR: trying to open camera: %@", error);
            return;
        }
        [session addInput:input];
    
        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [self.stillImageOutput setOutputSettings:outputSettings];
    
        [session addOutput:self.stillImageOutput];
    
        [session startRunning];
    }
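    The method above stores the output in `self.stillImageOutput`, so the view controller needs a matching property; a minimal wiring sketch (the `ViewController` class name and the `viewDidLoad` call site are assumptions, not part of the original answer) might be:

        // In the class extension -- the solution assumes this property exists.
        @interface ViewController ()
        @property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
        @end

        - (void)viewDidLoad {
            [super viewDidLoad];
            [self setupCaptureSession];  // builds the session and starts the preview
        }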
    
    
    /////////////////////////////////////////////////
    ////
    //// Method to capture Still Image from 
    //// Video Preview Layer
    ////
    /////////////////////////////////////////////////
    -(void) captureNow {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
    
        NSLog(@"about to request a capture from: %@", self.stillImageOutput);
        __weak typeof(self) weakSelf = self;
        [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    
             // Bail out if the capture failed; the sample buffer is NULL on error.
             if (imageSampleBuffer == NULL) {
                 NSLog(@"ERROR: capture failed: %@", error);
                 return;
             }
    
             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];
    
             [weakSelf displayImage:image];
         }];
    }
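    In the completion handler, `displayImage:` stands in for whatever consumes the result. To keep the picture in a variable, as the question asks, store it in a property; the `capturedImage` name below is only an illustration:

        // Assumed property: @property (strong, nonatomic) UIImage *capturedImage;
        self.capturedImage = image;

        // Optionally also write it to the photo library.
        UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);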