Tags: ios, swift3, camera, face-detection, google-ios-vision

Google Face Detection crashing when converting to image and trying to detect face


I am creating a custom camera with filters. When I add the following code, the app crashes without throwing any exception.

//Setting video output

func setupBuffer() {
    videoBuffer = AVCaptureVideoDataOutput()
    videoBuffer?.alwaysDiscardsLateVideoFrames = true
    videoBuffer?.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32RGBA)]
    videoBuffer?.setSampleBufferDelegate(self, queue: DispatchQueue.main)
    captureSession?.addOutput(videoBuffer)
}


public func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    if connection.videoOrientation != .portrait {
        connection.videoOrientation = .portrait
    }

    guard let image = GMVUtility.sampleBufferTo32RGBA(sampleBuffer) else {
        print("No Image 😂")
        return
    }

    pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    ciImage = CIImage(cvImageBuffer: pixelBuffer!, options: CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate) as! [String : Any]?)

    CameraView.filter = CIFilter(name: "CIPhotoEffectProcess")
    CameraView.filter?.setValue(ciImage, forKey: kCIInputImageKey)
    let cgimg = CameraView.context.createCGImage(CameraView.filter!.outputImage!, from: ciImage.extent)

    DispatchQueue.main.async {
        self.preview.image = UIImage(cgImage: cgimg!)
    }
}

But it crashes on:

guard let image = GMVUtility.sampleBufferTo32RGBA(sampleBuffer) else {
    print("No Image 😂")
    return
}

When I pass an image created from the CIImage instead, it doesn't recognize the face in the image. The complete code file is at https://www.dropbox.com/s/y1ewd1sh18h3ezj/CameraView.swift.zip?dl=0



Solution

  1) Create a separate serial queue for the sample buffer, so frame processing does not run on the main queue:

     fileprivate var videoDataOutputQueue = DispatchQueue(label: "VideoDataOutputQueue")
    

    2) Set up the buffer with kCVPixelFormatType_32BGRA instead of kCVPixelFormatType_32RGBA; AVCaptureVideoDataOutput only supports a small set of pixel formats (BGRA and the 4:2:0 YpCbCr variants), and requesting an unsupported one is what caused the crash:

            let videoBuffer = AVCaptureVideoDataOutput()
            videoBuffer.alwaysDiscardsLateVideoFrames = true
            videoBuffer.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)]
            videoBuffer.setSampleBufferDelegate(self, queue: videoDataOutputQueue)
            captureSession?.addOutput(videoBuffer)
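
    With those two changes, GMVUtility.sampleBufferTo32RGBA(_:) should return a valid UIImage in the delegate callback, which can then be fed to a GMVDetector. A minimal sketch of that callback, assuming a face detector created with default options (the property names here are illustrative, not from the question's code):

        import AVFoundation
        import GoogleMobileVision

        // Assumption: the detector is created once and reused across frames.
        lazy var faceDetector = GMVDetector(ofType: GMVDetectorTypeFace, options: nil)

        public func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
            // Runs on videoDataOutputQueue, not the main queue.
            guard let image = GMVUtility.sampleBufferTo32RGBA(sampleBuffer) else {
                print("No Image 😂")
                return
            }

            // Run Google Mobile Vision face detection on the converted frame.
            if let faces = faceDetector?.features(in: image, options: nil) as? [GMVFaceFeature] {
                print("Detected \(faces.count) face(s)")
            }
        }

    Creating the detector once and reusing it matters: constructing a GMVDetector per frame is expensive and would stall the capture queue.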