objective-c, uiview, avfoundation, avcapturesession, avcapturedevice

Display camera feed in UIView in iOS


I have an iOS app with a simple UIView placed in the view controller. I am trying to show the camera feed of the front-facing camera in the UIView. I am not trying to take a picture or record a video; I simply want to show the live feed in a UIView.

I have tried to implement AVCaptureVideoPreviewLayer, but the feed I get is blank; nothing seems to happen. Here is my code:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input;

@try {

    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

} @catch (NSException *exception) {
    NSLog(@"Error: %@", error);
} @finally {

    if (error == nil) {

        if ([session canAddInput:input]) {

            [session addInput:input];

            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];

            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];

            if ([session canAddOutput:stillImageOutput]) {

                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];

                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];

                [backgroundStreamView.layer addSublayer:previewLayer];

                [session startRunning];

                NSLog(@"session running");

            } else {
                NSLog(@"cannot add output");
            }

        } else {
            NSLog(@"cannot add input");
        }

    } else {
        NSLog(@"general error: %@", error);
    }
}

The session runs perfectly fine, but no video feed is shown. What am I doing wrong?


Solution

  • Managed to fix it myself; it turned out to be a fairly simple issue: I didn't specify the frame of the AVCaptureVideoPreviewLayer, so it was not appearing (presumably because the frame defaults to zero size).

    To fix this, I set the layer's frame to match the bounds of my custom UIView:

    [previewLayer setFrame:backgroundStreamView.bounds];
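
    Since the frame is only set once here, it can also help to keep it in sync when the view's size changes (for example on rotation or after an Auto Layout pass). A minimal sketch, assuming the layer and the container view are exposed as properties named previewLayer and backgroundStreamView (property names are an assumption, not from the original code):

    // Re-apply the container's bounds to the preview layer whenever layout changes.
    // Assumes self.previewLayer and self.backgroundStreamView hold the objects created above.
    - (void)viewDidLayoutSubviews {
        [super viewDidLayoutSubviews];
        self.previewLayer.frame = self.backgroundStreamView.bounds;
    }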
    

    Deprecation code fix

    AVCaptureStillImageOutput is also deprecated, so to fix that, I replaced it with the AVCapturePhotoOutput class. Thus the code changed from:

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
    

    to the following:

    AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
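
    With AVCapturePhotoOutput, the JPEG settings are no longer configured on the output itself; they travel with each capture request via an AVCapturePhotoSettings object. A rough sketch of what that could look like, only relevant if a still photo is actually taken later (it assumes the calling class conforms to AVCapturePhotoCaptureDelegate, and uses AVVideoCodecTypeJPEG, the iOS 11+ replacement for AVVideoCodecJPEG):

    // Codec settings are passed per capture request instead of on the output.
    AVCapturePhotoSettings *settings =
        [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
    [stillImageOutput capturePhotoWithSettings:settings delegate:self];

    // AVCapturePhotoCaptureDelegate callback (iOS 11+) delivering the captured photo.
    - (void)captureOutput:(AVCapturePhotoOutput *)output
    didFinishProcessingPhoto:(AVCapturePhoto *)photo
                       error:(NSError *)error {
        if (error == nil) {
            NSData *jpegData = [photo fileDataRepresentation];
            // use jpegData as needed
        }
    }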