Tags: ios, swift, camera, uikit, avfoundation

AVFoundation: can't store real-time CMSampleBuffers in an array


I modified Apple's AVCam sample code; what I want to do is store the latest 20 CMSampleBuffers in an array.

I added AVCaptureVideoDataOutputSampleBufferDelegate conformance to CameraViewController, created a new AVCaptureVideoDataOutput, connected it to the session, and set the delegate to self.
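
For reference, the output setup looks roughly like this (a minimal sketch; the queue label is a placeholder, and session is AVCam's existing AVCaptureSession):

    let videoDataOutput = AVCaptureVideoDataOutput()
    // Placeholder queue label; frames are delivered off the main thread.
    let videoDataQueue = DispatchQueue(label: "video.data.queue")
    videoDataOutput.setSampleBufferDelegate(self, queue: videoDataQueue)
    if session.canAddOutput(videoDataOutput) {
        session.addOutput(videoDataOutput)
    }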

In the callback function:

    var bufferArray: [CMSampleBuffer] = []

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Keep only the most recent 20 buffers.
        if bufferArray.count >= 20 {
            bufferArray.removeFirst()
        }
        bufferArray.append(sampleBuffer)
    }

But after 5–10 calls (depending on the session preset), the delegate method is never called again, and the preview in my other camera app freezes as well.

The weird thing is that if I hold only a single reference to the buffer, e.g. by changing the code to singlebuffer = sampleBuffer, everything works fine. I've changed many capture settings, but none of them helped.

Update: in the didDrop callback, the attachment shows the reason DroppedFrameReason(P) = OutOfBuffers. Apple's documentation says:

The module providing sample buffers has run out of source buffers. This condition is typically caused by the client holding onto buffers for too long and can be alleviated by returning buffers to the provider.
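
For anyone checking the same thing, the drop reason can be read from the dropped buffer's attachments in the didDrop delegate method (a minimal sketch):

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // The reason (e.g. OutOfBuffers) is attached to the dropped buffer.
        if let reason = CMGetAttachment(sampleBuffer,
                                        key: kCMSampleBufferAttachmentKey_DroppedFrameReason,
                                        attachmentModeOut: nil) {
            print("Dropped frame, reason: \(reason)")
        }
    }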


Solution

  • I converted the buffer into a CVPixelBuffer, then a CIImage, then a CGImage:

            let image = CIImage(cvPixelBuffer: CMSampleBufferGetImageBuffer(sampleBuffer)!)
            // create a CGImage from the CIImage
            let context = CIContext(options: nil)
            let cgImage = context.createCGImage(image, from: image.extent)
            liveArray.append(cgImage!)
    

    Once converted to a CGImage, the array no longer holds a reference to the original sample buffer, so the capture pipeline gets its buffers back and keeps delivering frames.
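
Putting it together, the delegate callback keeps only the converted images and never retains the buffers themselves. This is a sketch under the assumptions above (liveArray holds CGImages; the CIContext is created once and reused, since building a context per frame is expensive):

    var liveArray: [CGImage] = []
    let ciContext = CIContext(options: nil) // reused across frames

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(image, from: image.extent) else { return }
        // CGImage owns its own pixel storage, so the capture pool's buffer
        // is released as soon as this method returns.
        if liveArray.count >= 20 {
            liveArray.removeFirst()
        }
        liveArray.append(cgImage)
    }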