ios · swift · avfoundation · grand-central-dispatch · avcapturesession

How do I stop camera lag in a collectionView cell?


I have a collectionView whose cells act as screens. When I swipe to the camera cell after opening the app, there is a lag for about a second; after that, swiping back and forth is smooth (below is a video of this lag). Is there any way to prevent this, maybe by starting the capture session in the background before the cell is reached? Thank you for your help.


Code for Camera Cell

import UIKit
import AVFoundation


class MainCameraCollectionViewCell: UICollectionViewCell {

    var captureSession = AVCaptureSession()
    private var sessionQueue: DispatchQueue!
    var captureConnection = AVCaptureConnection()

    var backCamera: AVCaptureDevice?
    var frontCamera: AVCaptureDevice?
    var currentCamera: AVCaptureDevice?

    var photoOutPut: AVCapturePhotoOutput?

    var cameraPreviewLayer: AVCaptureVideoPreviewLayer?

    var image: UIImage?

    var usingFrontCamera = false

    override func awakeFromNib() {
        super.awakeFromNib()
        setupCaptureSession()
        setupDevice()
        setupInput()
        self.setupPreviewLayer()
        startRunningCaptureSession()
    }

    func setupCaptureSession(){
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
        sessionQueue = DispatchQueue(label: "session queue")
    }

    func setupDevice(usingFrontCamera:Bool = false){
        DispatchQueue.main.async {
            //sessionQueue.async {
            let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInWideAngleCamera], mediaType: AVMediaType.video, position: AVCaptureDevice.Position.unspecified)
            let devices = deviceDiscoverySession.devices

            for device in devices{
                if usingFrontCamera && device.position == AVCaptureDevice.Position.front {
                    //backCamera = device
                    self.currentCamera = device
                } else if device.position == AVCaptureDevice.Position.back {
                    //frontCamera = device
                    self.currentCamera = device
                }
            }
        }
    }
    func setupInput() {
        DispatchQueue.main.async {
            do {
                let captureDeviceInput = try AVCaptureDeviceInput(device: self.currentCamera!)
                if self.captureSession.canAddInput(captureDeviceInput) {
                    self.captureSession.addInput(captureDeviceInput)
                }
                self.photoOutPut = AVCapturePhotoOutput()
                self.photoOutPut?.setPreparedPhotoSettingsArray([AVCapturePhotoSettings(format:[AVVideoCodecKey: AVVideoCodecType.jpeg])], completionHandler: nil)
                if self.captureSession.canAddOutput(self.photoOutPut!) {
                    self.captureSession.addOutput(self.photoOutPut!)
                }
            } catch {
                print(error)
            }
        }
    }
    func setupPreviewLayer(){
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        cameraPreviewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraPreviewLayer?.frame = CGRect(x: 0, y: 0, width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
        self.layer.insertSublayer(cameraPreviewLayer!, at: 0)
    }

    func startRunningCaptureSession(){
        captureSession.startRunning()
    }

    @IBAction func cameraButton_Touched(_ sender: Any) {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        //
        settings.isAutoStillImageStabilizationEnabled = true
        if let photoOutputConnection = self.photoOutPut?.connection(with: .video){
            photoOutputConnection.videoOrientation = (cameraPreviewLayer?.connection?.videoOrientation)!
        }
    }



    @IBAction func Flip_camera(_ sender: UIButton?) {
        print("Flip Touched")

        self.captureSession.beginConfiguration()
         if let inputs = self.captureSession.inputs as? [AVCaptureDeviceInput] {
            for input in inputs {
                self.captureSession.removeInput(input)
                print("input removed")
            }
            //This seemed to have fixed it
            for output in self.captureSession.outputs{
                captureSession.removeOutput(output)
                print("out put removed")
            }
        }


        self.usingFrontCamera = !self.usingFrontCamera
        self.setupCaptureSession()
        self.setupDevice(usingFrontCamera: self.usingFrontCamera)
        self.setupInput()
        self.captureSession.commitConfiguration()
        self.startRunningCaptureSession()
    }

}

Solution

  • Initializing the camera takes time. Once your app requests use of the camera, the supporting software has to be initialized in the background, and there isn't really a way to speed that up.

    I would recommend placing everything related to AVFoundation on a background queue and initializing it right after your app loads. That way, the camera will already be running by the time the user swipes to the camera cell. If you don't want to preload, you can at least still move the AVFoundation work off the main thread and show some kind of activity indicator so the user sees that something is loading, instead of letting your main thread block while the camera boots up.
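As a rough illustration of the advice above, here is a minimal sketch of the cell with session configuration and `startRunning()` moved onto a serial background queue, so only the preview-layer work touches the main thread. It reuses the property names from the question (`captureSession`, `sessionQueue`, `cameraPreviewLayer`); the simplified single-camera setup is an assumption for brevity, not the asker's full flip-camera logic.

```swift
import UIKit
import AVFoundation

class MainCameraCollectionViewCell: UICollectionViewCell {

    let captureSession = AVCaptureSession()
    // Serial queue: all session configuration and start/stop happens here,
    // so the camera boot never blocks the main thread (and the swipe animation).
    private let sessionQueue = DispatchQueue(label: "session queue")
    var cameraPreviewLayer: AVCaptureVideoPreviewLayer?

    override func awakeFromNib() {
        super.awakeFromNib()
        // Layer work must stay on the main thread...
        setupPreviewLayer()
        // ...but configuring and starting the session can run in the background.
        sessionQueue.async {
            self.configureSession()
            self.captureSession.startRunning()
        }
    }

    private func configureSession() {
        captureSession.beginConfiguration()
        captureSession.sessionPreset = .photo

        // Simplified: back wide-angle camera only.
        if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video,
                                                position: .back),
           let input = try? AVCaptureDeviceInput(device: camera),
           captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }

        let photoOutput = AVCapturePhotoOutput()
        if captureSession.canAddOutput(photoOutput) {
            captureSession.addOutput(photoOutput)
        }
        captureSession.commitConfiguration()
    }

    private func setupPreviewLayer() {
        // The preview layer can be attached before the session starts running;
        // it simply begins displaying frames once startRunning() completes.
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.connection?.videoOrientation = .portrait
        previewLayer.frame = contentView.bounds
        layer.insertSublayer(previewLayer, at: 0)
        cameraPreviewLayer = previewLayer
    }
}
```

To preload even earlier, the same `configureSession()` / `startRunning()` pair could be kicked off from the owning view controller's `viewDidLoad` (with the session held outside the cell), so the camera is already warm before the cell ever scrolls on screen.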