
AVCaptureVideoDataOutputSampleBufferDelegate.CaptureOutput not called


I currently have a self-developed framework (MySDK) and an iOS app (MyApp) that uses MySDK.

Inside MySDK, I have a class (Scanner) that processes images from the video output of the device camera.

Here's a sample of my code:

Scanner.swift

import UIKit
import AVFoundation

class Scanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var captureDevice : AVCaptureDevice?
    var captureOutput : AVCaptureVideoDataOutput?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureSession : AVCaptureSession?

    var rootViewController : UIViewController?

    func scanImage (viewController: UIViewController)
    {
        NSLog("%@", "scanning begins!")

        if (captureSession == nil) { captureSession = AVCaptureSession() }

        rootViewController = viewController;

        captureSession!.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }

        if (captureDevice != nil) {
            NSLog("%@", "beginning session!")

            beginSession()
        }
    }

    func beginSession()
    {
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        if (captureOutput == nil) { captureOutput = AVCaptureVideoDataOutput() }
        if (previewLayer == nil) { previewLayer = AVCaptureVideoPreviewLayer() }

        let queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);

        captureOutput!.setSampleBufferDelegate(self, queue: queue)
        captureOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]

        captureSession!.addInput(try! AVCaptureDeviceInput(device: captureDevice))
        captureSession!.addOutput(captureOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.frame = rootViewController!.view.layer.frame

        rootViewController!.view.layer.addSublayer(previewLayer!)

        captureSession!.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!)
    {
        NSLog("%@", "captured!")
    }
}

Inside MyApp, I have a ViewController that implements an IBAction, in which a Scanner instance is created and its scanImage function is called.

MyApp.m:

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    Scanner * scanner = [[Scanner alloc] init];

    [scanner scanImage:self];
}

The camera view comes up inside the app, but the captureOutput delegate method is never called, and the console contains only these two lines:

2016-03-07 11:11:45.860 myapp[1236:337377] scanning begins!
2016-03-07 11:11:45.984 myapp[1236:337377] beginning session!

Creating a standalone app and embedding the code from Scanner.swift directly in a ViewController works just fine; the captureOutput method fires properly.

Does anyone have any idea what I am doing wrong here?


Solution

  • After much trial and error, I finally found a solution to my problem.

    Apparently, I was creating the Scanner object only as a local variable inside the IBAction, not as a class variable. ARC therefore released the Scanner as soon as the method returned: the preview layer keeps the capture session alive (which is why the camera view still appears), but the sample buffer delegate is not retained by the output, so the deallocated Scanner never receives any callbacks.

    Once the Scanner object was created as a class variable, the delegate method captureOutput was fired properly.