
Swift - Getting UIImages from Camera (AVCaptureSession)


Intro and background:

I have been working for some time on a project that lets the user perform custom manipulations on a live feed from their camera.

At the moment, I start the capture session in the following way:

import UIKit
import AVFoundation

var session: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Size the preview layer once the view's final bounds are known.
    videoPreviewLayer!.frame = CameraView.bounds
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto
    // Default back-facing camera.
    let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
    }
    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)
        // Still-image output configured for JPEG capture.
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if session!.canAddOutput(stillImageOutput) {
            session!.addOutput(stillImageOutput)
            // Live preview layer, aspect-fit and locked to portrait.
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
            videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
            CameraView.layer.addSublayer(videoPreviewLayer!)
            session!.startRunning()
        }
    }
}

where CameraView is a UIView in my view controller. I now have a function called singleTapped() in which I want to get every frame of the capture, process it, and then put it into the CameraView frame (perhaps I should be using a UIImageView instead?)...
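
For reference, the usual way to receive a continuous stream of frames is to add an AVCaptureVideoDataOutput to the session alongside the preview layer and implement its sample-buffer delegate. Below is a minimal sketch against the same Swift 3-era API as the code above; the queue label is arbitrary, and it assumes the view controller declares conformance to AVCaptureVideoDataOutputSampleBufferDelegate:

// Inside viewWillAppear, after session!.addInput(input):
let videoOutput = AVCaptureVideoDataOutput()
// BGRA is convenient for converting to CGImage/UIImage later.
videoOutput.videoSettings =
    [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
// Drop frames rather than queue them if processing falls behind.
videoOutput.alwaysDiscardsLateVideoFrames = true
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frame.processing"))
if session!.canAddOutput(videoOutput) {
    session!.addOutput(videoOutput)
}

// Delegate callback, invoked once per captured frame on the queue above
// (Swift 3-era signature).
func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!) {
    // Convert sampleBuffer to a UIImage here, run the custom processing,
    // then update the UI on the main queue.
}

Displaying the processed frames in a UIImageView layered over (or replacing) CameraView is a common choice, since the preview layer itself always shows the raw feed.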

Research:

I have looked here and here, as well as at many other posts, on getting frames from the camera, yet these don't necessarily get me where I need to be. What's interesting is in the first link I provided: in their answer they have:

self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
    let image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
    let data_image = UIImage(data: image) // THEY EXTRACTED A UIIMAGE HERE
    self.imageView.image = data_image
}

which does indeed get a UIImage from the camera, but is this a viable method for 30fps?
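
As an aside: captureStillImageAsynchronously is meant for one-off photo captures, so calling it in a loop is generally not a viable way to sustain 30fps; the per-frame delegate sketched earlier is the intended path. For the CMSampleBuffer-to-UIImage step inside that delegate, a common Core Image conversion looks roughly like the sketch below. The ciContext property name is my own, and the context is deliberately created once and reused, since building a CIContext per frame is expensive:

import AVFoundation
import CoreImage
import UIKit

// Created once (e.g. as a property); reused for every frame.
let ciContext = CIContext()

// Sketch: convert a CMSampleBuffer from the video data output into a UIImage.
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}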

Rationale and Constraints:

The reason I need a UIImage is that I am using a library someone else wrote that quickly transforms a UIImage in a custom way. I want to present this transformation to the user "live".
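
Putting the pieces together, the per-frame pipeline inside the delegate could then look like the sketch below, where transform(_:) is a placeholder for whatever the third-party library actually exposes (the library is unnamed here), and imageView is an assumed UIImageView placed over CameraView:

// Sketch: full per-frame pipeline (Swift 3-era signature). transform(_:)
// is a hypothetical stand-in for the third-party UIImage manipulation.
func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!) {
    guard let frame = image(from: sampleBuffer) else { return }
    let processed = transform(frame)      // hypothetical library call
    DispatchQueue.main.async {
        self.imageView.image = processed  // imageView: UIImageView over CameraView
    }
}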

In conclusion:

Please let me know if I am missing something or if I should reword anything. As mentioned above, this is my first post, so I am not quite up to speed on SO nuances. Thanks, and cheers.


Solution

  • You should maybe reconsider using AVCaptureSession. For what you are doing (I assume), you should try using OpenCV. It's a great utility for image manipulation, especially if you are doing so at 30/60fps (the actual frame rate after processing might be lower, and I guarantee it will be). Depending on what the manipulation you have been given involves, you can easily port it over into Xcode using bridging headers, or convert everything entirely to C++ for use with OpenCV.

    With OpenCV you can drive the camera through its built-in functions, which can save you a lot of processing time and therefore runtime. For example, take a look at this.

    I have used OpenCV in situations similar to the one you just described, and I think you could benefit from it. Swift is nice, but sometimes certain things are better handled through other means; a rough sketch of the bridging-header pattern follows below.
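
    To make the bridging-header suggestion concrete, here is a rough sketch of what the Swift side might look like. OpenCVWrapper and processFrame(_:) are hypothetical names, not OpenCV API; the actual cv:: calls would live in an Objective-C++ (.mm) implementation exposed to Swift through the bridging header. (OpenCV's iOS framework also provides CvVideoCamera, which delivers frames to an Objective-C++ delegate as cv::Mat directly, skipping the UIImage round-trip.)

    import UIKit

    // Hypothetical Objective-C++ wrapper, declared in the bridging header.
    // Its .mm implementation would convert UIImage <-> cv::Mat and apply
    // the custom transformation with OpenCV calls:
    //
    //   @interface OpenCVWrapper : NSObject
    //   + (UIImage *)processFrame:(UIImage *)frame;
    //   @end

    final class LiveFilter {
        // Feed each captured frame through the (assumed) OpenCV-backed
        // wrapper, then hand the result back on the main queue for display.
        func apply(to frame: UIImage, completion: @escaping (UIImage) -> Void) {
            let processed = OpenCVWrapper.processFrame(frame)
            DispatchQueue.main.async { completion(processed) }
        }
    }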