Tags: ios, swift, camera, avcapturesession, avcapturedevice

SWIFT 3: Capture photo with AVCapturePhotoOutput (Need another set of eyes to look over code, why isn't this working?)


I have a custom camera, with AVCapturePhotoCaptureDelegate added to the class, and the following code to capture a still image:

Outlets, Variables, and Constants

@IBOutlet weak var cameraPreview: UIView!
@IBOutlet weak var takePhotoPreview: UIImageView!

private var cameraView: AVCaptureVideoPreviewLayer!
private var camera: AVCaptureDevice!
private var cameraInput: AVCaptureDeviceInput!
private var cameraOutput: AVCapturePhotoOutput!
private var photoSampleBuffer: CMSampleBuffer?
private var previewPhotoSampleBuffer: CMSampleBuffer?
private var photoData: Data? = nil

private let cameraSession = AVCaptureSession()
private let photoOutput = AVCapturePhotoOutput()

Setup Camera Session

private func createCamera() {
    cameraSession.beginConfiguration()
    cameraSession.sessionPreset = AVCaptureSessionPresetPhoto
    cameraSession.automaticallyConfiguresCaptureDeviceForWideColor = true

    // Add Camera Input
    if let defaultCamera = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaTypeVideo, position: .back).devices {
        camera = defaultCamera.first
        do {
            let cameraInput = try AVCaptureDeviceInput(device: camera)
            if cameraSession.canAddInput(cameraInput) {
                cameraSession.addInput(cameraInput)
                print("Camera input added to the session")
            }
        } catch { print("Could not add camera input to the camera session") }
    }

    // Add Camera View Input
    if let cameraView = AVCaptureVideoPreviewLayer(session: cameraSession) {
        cameraView.frame = cameraPreview.bounds
        cameraView.videoGravity = AVLayerVideoGravityResizeAspectFill
        cameraView.cornerRadius = 12.0
        cameraPreview.layer.addSublayer(cameraView)
        print("Camera view created for the camera session")
    } else { print("Could not create camera preview") }

    // Add Photo Output
    let cameraPhotoOutput = AVCapturePhotoOutput()
    if cameraSession.canAddOutput(cameraPhotoOutput) {
        cameraSession.addOutput(cameraPhotoOutput)
        cameraPhotoOutput.isHighResolutionCaptureEnabled = true
        print("Camera output added to the camera session")
    } else {
        print("Could not add camera photo output to the camera session")
        cameraSession.commitConfiguration()
        return
    }

    cameraSession.commitConfiguration()

    cameraSession.startRunning()
}

Capture Button

@IBOutlet weak var cameraShutter: UIButton!
@IBAction func cameraShutter(_ sender: UIButton) {
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.flashMode = .on
    photoSettings.isHighResolutionPhotoEnabled = true
    photoSettings.isAutoStillImageStabilizationEnabled = true
    if photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0 {
        photoSettings.previewPhotoFormat = [ kCVPixelBufferPixelFormatTypeKey as String : photoSettings.availablePreviewPhotoPixelFormatTypes.first!]
    }
    cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)
}

iOS Observing Camera Function

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let photoSampleBuffer = photoSampleBuffer {
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)
        let photoDataProvider = CGDataProvider(data: photoData as! CFData)
        let cgImagePhotoRef = CGImage(jpegDataProviderSource: photoDataProvider!, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric)
        let newPhoto = UIImage(cgImage: cgImagePhotoRef!, scale: 1.0, orientation: UIImageOrientation.right)
        self.takePhotoPreview.image = newPhoto
        self.takePhotoPreview.isHidden = false
    } else {
        print("Error capturing photo: \(error)")
        return
    }
}

Alright, so here is the deal -- I put a breakpoint at cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self) and, upon stepping into that line, received the following error message:

Error Message

fatal error: unexpectedly found nil while unwrapping an Optional value [runtime details]

The code above is taken directly from Apple's example doc "AVCam", along with input from SO Q&As (link, link, and others which repeat those answers). My end goal is to capture an image and immediately push the image and the user to a new ViewController to edit/post/save; for now, though, I'm using a UIImageView just to confirm the capture...which isn't working in the first place.
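
For reference, the eventual hand-off would look roughly like this (a sketch only; the "showReview" segue identifier is a placeholder, and ReviewViewController with its photoContent: UIImage? property is assumed as the destination):

// Inside the camera view controller. Sketch: push the captured image and the
// user on to the next screen. "showReview" and ReviewViewController.photoContent
// are assumed/placeholder names.
func presentCapturedPhoto(_ photo: UIImage) {
    performSegue(withIdentifier: "showReview", sender: photo)
}

override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if segue.identifier == "showReview",
        let review = segue.destination as? ReviewViewController,
        let photo = sender as? UIImage {
        // Hand the captured image to the destination before it appears.
        review.photoContent = photo
    }
}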

So, SO: what is going on with this implementation? It's been driving me nuts for days.

Swift 3, Xcode 8


Solution

  • Alright, figured it out. El Tomato was on the right track about the problem child, but it wasn't quite the right prescription. My createCamera() function set up its photo output in a constant local to its own body, so the output that actually got configured and added to the session was not visible outside that function. So while the shutter action was calling a perfectly valid AVCapturePhotoOutput(), there was no configured buffer feed behind it when capturePhoto() executed...throwing the error described.

    So this means the line:

    cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)

    was correct, but the setup feeding it was not. To confirm proper execution I...

    • changed my private let photoOutput = AVCapturePhotoOutput() constant
    • to private let cameraPhotoOutput = AVCapturePhotoOutput()
    • and called that constant directly in private func createCamera()

    which immediately executed an image capture flawlessly.
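
    In other words, the whole fix is making the output a stored property and pointing both the session setup and the shutter action at that one instance. A minimal sketch of the corrected wiring (CameraViewController is a placeholder class name; the camera input and preview-layer setup are elided, and the delegate method stays as in the question):

    import UIKit
    import AVFoundation

    class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

        private let cameraSession = AVCaptureSession()
        // One shared output, declared as a property so the session configuration
        // and the shutter action both talk to the same instance.
        private let cameraPhotoOutput = AVCapturePhotoOutput()

        private func createCamera() {
            cameraSession.beginConfiguration()
            cameraSession.sessionPreset = AVCaptureSessionPresetPhoto
            // ... add the camera input and the preview layer exactly as in the question ...
            if cameraSession.canAddOutput(cameraPhotoOutput) {
                cameraSession.addOutput(cameraPhotoOutput)
                cameraPhotoOutput.isHighResolutionCaptureEnabled = true
            }
            cameraSession.commitConfiguration()
            cameraSession.startRunning()
        }

        @IBAction func cameraShutter(_ sender: UIButton) {
            let photoSettings = AVCapturePhotoSettings()
            photoSettings.flashMode = .on
            photoSettings.isHighResolutionPhotoEnabled = true
            // The call now targets an output that is actually attached to the
            // running session, so there is no unconfigured state left to hit nil.
            cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)
        }
    }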

    I also tried replacing cameraPhotoOutput, an AVCapturePhotoOutput(), with cameraOutput, an AVCapturePhotoOutput!, and that simply reproduced the error.

    If you are interested: the CGImage creation process stayed the same inside the capture(_:didFinishProcessingPhotoSampleBuffer:...) delegate method. Within it, I also determined the camera device's position, changed the image's orientation for the front camera, and dispatched the photo on the main queue over to a var photoContent: UIImage? variable on the ReviewViewController, roughly as sketched below.
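
    A minimal sketch of that post-processing (the deliver helper name and the .leftMirrored orientation choice for the front camera are illustrative assumptions; camera is the configured AVCaptureDevice and photoContent is the UIImage? property on ReviewViewController):

    // Sketch only: called from within the photo capture delegate method once
    // cgImagePhotoRef has been created.
    private func deliver(_ cgImage: CGImage, from position: AVCaptureDevicePosition, to review: ReviewViewController) {
        // Front-camera shots are mirrored here so the result matches the preview;
        // the exact orientation value is an assumption -- adjust as needed.
        let orientation: UIImageOrientation = (position == .front) ? .leftMirrored : .right
        let photo = UIImage(cgImage: cgImage, scale: 1.0, orientation: orientation)
        DispatchQueue.main.async {
            // Hand the finished image to the review screen on the main queue.
            review.photoContent = photo
        }
    }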

    Hope my mental error helps someone else :-)