Tags: ios, swift, swift3, avcapturesession

Taking a photo with a custom camera in Swift 3


In Swift 2.3 I used this code to take a picture with my custom camera:

    func didPressTakePhoto() {
        if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProviderCreateWithCFData(imageData)
                    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                    let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)

                    self.captureImageView.image = image
                }
            })
        }
    }

But this line: stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

Shows this error:

Value of type 'AVCapturePhotoOutput' has no member 'captureStillImageAsynchronouslyFromConnection'

I tried to fix it myself, but I kept running into more errors, which is why I'm posting my original code.

Does anybody know how to make my code work again?

Thank you.


Solution

  • Thanks to Sharpkits I found my solution. This delegate callback works for me (the call that actually triggers the capture is sketched after the code):

    // AVCapturePhotoCaptureDelegate callback that delivers the captured photo (iOS 10 / Swift 3)
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print(error.localizedDescription)
            return
        }

        // Convert the sample buffer to JPEG data, then to a UIImage
        if let sampleBuffer = photoSampleBuffer,
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer),
            let dataProvider = CGDataProvider(data: imageData as CFData),
            let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric) {

            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

            // Crop and scale with the helper methods already in this view controller
            let croppedImage = self.cropToSquare(image: image)
            let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))

            self.tempImageView.image = newImage
            self.tempImageView.isHidden = false
        }
    }
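
The delegate callback above only processes the result; the capture itself is started with capturePhoto(with:delegate:) on the AVCapturePhotoOutput, which replaces captureStillImageAsynchronouslyFromConnection. Below is a minimal sketch of a Swift 3 didPressTakePhoto(), assuming stillImageOutput is an AVCapturePhotoOutput already added to the running session and that the view controller conforms to AVCapturePhotoCaptureDelegate:

    func didPressTakePhoto() {
        // Sketch only: assumes stillImageOutput is an AVCapturePhotoOutput
        // attached to a running AVCaptureSession.
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])

        // Optionally request a small preview image alongside the JPEG;
        // it arrives as previewPhotoSampleBuffer in the delegate callback.
        if let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first {
            settings.previewPhotoFormat = [
                kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                kCVPixelBufferWidthKey as String: 160,
                kCVPixelBufferHeightKey as String: 160
            ]
        }

        // The result is delivered to capture(_:didFinishProcessingPhotoSampleBuffer:...) above.
        stillImageOutput?.capturePhoto(with: settings, delegate: self)
    }

Note that an AVCapturePhotoSettings instance can only be used for a single capture, so create a fresh one for every shot.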