Tags: ios, swift, xcode, uiimage, avfoundation

How to take UIImage of AVCaptureVideoPreviewLayer instead of AVCapturePhotoOutput capture


I want to "stream" the preview layer to my server, but only specific frames should be sent. Basically, I want to take a snapshot of the AVCaptureVideoPreviewLayer, scale it down to 28×28, turn it into an intensity array, and send it to my socket layer, where my Python backend handles the rest.

The problem is that AVCapturePhotoOutput's capture function is insanely slow, so I can't call it repeatedly. Not to mention it always plays the camera shutter sound haha.

The other problem is that taking a snapshot of an AVCaptureVideoPreviewLayer is really difficult: UIGraphicsBeginImageContext almost always returns a blank/clear image.

Help a brother out, thanks!


Solution

  • Instead of trying to grab frames from AVCaptureVideoPreviewLayer, add an AVCaptureVideoDataOutput to the session and implement AVCaptureVideoDataOutputSampleBufferDelegate, which hands you every frame as a sample buffer. Here is an example:

    import Foundation
    import UIKit
    import AVFoundation
    
    protocol CaptureManagerDelegate: AnyObject {
        func processCapturedImage(image: UIImage)
    }
    
    class CaptureManager: NSObject {
        internal static let shared = CaptureManager()
        weak var delegate: CaptureManagerDelegate?
        var session: AVCaptureSession?
    
        override init() {
            super.init()
            session = AVCaptureSession()
    
            // Setup input
            guard let device = AVCaptureDevice.default(for: .video),
                  let input = try? AVCaptureDeviceInput(device: device) else {
                return
            }
            session?.addInput(input)
    
            // Setup output
            let output = AVCaptureVideoDataOutput()
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
            output.alwaysDiscardsLateVideoFrames = true
            // Deliver frames on a dedicated serial queue so processing never blocks the main thread
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "capture.sampleBuffer"))
            session?.addOutput(output)
        }
    
        func startSession() {
            session?.startRunning()
        }
    
        func stopSession() {
            session?.stopRunning()
        }
    
        func getImageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage? {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
                return nil
            }
            CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
            // Unlock on every exit path, including the early returns below
            defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
            let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)
            let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            // 32BGRA corresponds to premultipliedFirst + byteOrder32Little in Core Graphics
            let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
            guard let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue) else {
                return nil
            }
            guard let cgImage = context.makeImage() else {
                return nil
            }
            // .right compensates for the sensor's native landscape orientation in portrait
            return UIImage(cgImage: cgImage, scale: 1, orientation: .right)
        }
    }
    
    extension CaptureManager: AVCaptureVideoDataOutputSampleBufferDelegate {
        func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
            guard let outputImage = getImageFromSampleBuffer(sampleBuffer: sampleBuffer) else {
                return
            }
            delegate?.processCapturedImage(image: outputImage)
        }
    }
    

    Update: To process the images, implement the processCapturedImage method of the CaptureManagerDelegate protocol in whichever class you like, for example:

    import UIKit
    
    class ViewController: UIViewController {
        @IBOutlet weak var imageView: UIImageView!
        override func viewDidLoad() {
            super.viewDidLoad()
            CaptureManager.shared.delegate = self
            CaptureManager.shared.startSession()
        }
    }
    
    extension ViewController: CaptureManagerDelegate {
        func processCapturedImage(image: UIImage) {
            // The delegate is called on the capture queue, so hop to the main thread for UI work
            DispatchQueue.main.async {
                self.imageView.image = image
            }
        }
    }
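
    The question also asks for a 28×28 intensity array rather than a full-size UIImage. A minimal sketch of that step is below; `FrameStreamer` and `sendToServer(_:)` are hypothetical placeholder names, not part of the code above, and the 0.5 s throttle interval is an arbitrary example value. Drawing the image into a small 8-bit grayscale CGContext performs the downscale and the color-to-intensity conversion in one step.

    ```swift
    import UIKit

    // Hypothetical delegate implementation: throttles frames and
    // converts each kept frame to a 28×28 array of 0–255 intensities.
    final class FrameStreamer: CaptureManagerDelegate {
        private var lastSent = Date.distantPast

        func processCapturedImage(image: UIImage) {
            // Only forward a frame every 0.5 s instead of all ~30 fps
            guard Date().timeIntervalSince(lastSent) > 0.5 else { return }
            lastSent = Date()
            guard let intensities = FrameStreamer.intensityArray(from: image, side: 28) else { return }
            sendToServer(intensities)
        }

        /// Draws the image into a `side`×`side` 8-bit grayscale context and
        /// returns the raw pixel intensities, row by row.
        static func intensityArray(from image: UIImage, side: Int) -> [UInt8]? {
            guard let cgImage = image.cgImage else { return nil }
            var pixels = [UInt8](repeating: 0, count: side * side)
            let ok = pixels.withUnsafeMutableBytes { buffer -> Bool in
                guard let context = CGContext(data: buffer.baseAddress,
                                              width: side, height: side,
                                              bitsPerComponent: 8, bytesPerRow: side,
                                              space: CGColorSpaceCreateDeviceGray(),
                                              bitmapInfo: CGImageAlphaInfo.none.rawValue) else {
                    return false
                }
                // Drawing into the tiny context does the 28×28 downscale for us
                context.draw(cgImage, in: CGRect(x: 0, y: 0, width: side, height: side))
                return true
            }
            return ok ? pixels : nil
        }

        private func sendToServer(_ intensities: [UInt8]) {
            // Hook this up to your socket layer
        }
    }
    ```

    Set `CaptureManager.shared.delegate = FrameStreamer()` (keeping a strong reference to the streamer, since the delegate is weak) instead of, or alongside, the view controller above.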