ios · objective-c · camera · avcapturesession

Capture image with bounds?


I am able to capture images from the iOS rear-facing camera. Everything works flawlessly, except that I want it to take the picture according to the bounds of my UIView.

My code is below:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.

    session = [[AVCaptureSession alloc] init];

    session.sessionPreset = AVCaptureSessionPresetMedium;

    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = vImagePreview.bounds;
    [vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [self backFacingCameraIfAvailable];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    // Add the output to the session before querying its connections,
    // otherwise stillImageOutput.connections is empty at this point.
    [session addOutput:stillImageOutput];

    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [session startRunning];
}

-(AVCaptureDevice *)backFacingCameraIfAvailable{

    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices){
        if (device.position == AVCaptureDevicePositionBack){
            captureDevice = device;
            break;
        }
    }

    //  couldn't find one on the back, so just get the default video device.
    if (!captureDevice){
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    return captureDevice;
}

And below is the code to capture the image:

- (IBAction)captureTask {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections){
        for (AVCaptureInputPort *port in [connection inputPorts]){

            if ([[port mediaType] isEqual:AVMediaTypeVideo]){

                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         // kCGImagePropertyExifDictionary comes from the ImageIO framework (#import <ImageIO/ImageIO.h>).
         CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
         if (exifAttachments) {
             // Do something with the attachments.
             NSLog(@"attachements: %@", exifAttachments);
         } else {
             NSLog(@"no attachments");
         }

         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         UIImage *image = [[UIImage alloc] initWithData:imageData];
         stillImage = image;

     }];
}

The issue I'm facing is that it takes the picture and saves it to stillImage, but from what I can tell the image covers the whole iPhone screen. It is not clipped to the bounds of the UIView *vImagePreview I created. Is there a way to clip the captured image to those bounds?

[EDIT]

After reading the docs, I realized the image is at the correct resolution for the preset I chose: session.sessionPreset = AVCaptureSessionPresetMedium;. Is there a way to make the image square, like how Instagram makes their images? According to the docs, none of the session presets are square :(

I tried with the below:

captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResize;

However, that only stretches the preview to fill the current view; it doesn't make the captured image square.


Solution

  • I understand your frustration; presets should be customizable or at least offer more options! What I do with my images is crop them about the center, for which I wrote the following code:

    - (UIImage *)crop:(UIImage *)image from:(CGSize)src to:(CGSize)dst
    {
        // Center of the source image.
        CGPoint cropCenter = CGPointMake((src.width/2), (src.height/2));
        // Top-left corner of a dst-sized rect centered in the source.
        CGPoint cropStart = CGPointMake((cropCenter.x - (dst.width/2)), (cropCenter.y - (dst.height/2)));
        CGRect cropRect = CGRectMake(cropStart.x, cropStart.y, dst.width, dst.height);

        // Crop the underlying CGImage and wrap it back up as a UIImage.
        CGImageRef cropRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
        UIImage *cropImage = [UIImage imageWithCGImage:cropRef];
        CGImageRelease(cropRef);

        return cropImage;
    }
    

    Where src represents the original dimensions, dst represents the cropped dimensions, and image is of course the image you want cropped.
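
    To get the square, Instagram-style photo asked about above, you can call this from the capture completion handler with a square dst. A minimal sketch, assuming the side of the square should simply be the shorter edge of the captured photo (stillImage is the ivar set in your captureTask):

    // Somewhere after stillImage has been set in the completion handler.
    UIImage *captured = stillImage;
    // Use the shorter edge so the square crop stays entirely inside the photo.
    CGFloat side = MIN(captured.size.width, captured.size.height);
    UIImage *squareImage = [self crop:captured
                                 from:captured.size
                                   to:CGSizeMake(side, side)];

    One caveat: CGImageCreateWithImageInRect operates on the underlying CGImage in pixel coordinates and ignores imageOrientation, so a photo captured in portrait can come out rotated or offset; if that happens, normalize the orientation (for example by redrawing the image into a graphics context) before cropping.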