Tags: ios, avfoundation, avassetwriter

Explanation for CVPixelBufferPool with AVAssetWriter?


So let's say I want to make a movie from images. I'm told to use AVAssetWriter, along with an AVAssetWriterInput to append CVPixelBuffer objects. But I'm very confused.

Why do we create the pixel buffers, only to then create a bitmap context and draw into it using drawViewHierarchyInRect to make the movie?
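
For context, the pipeline I keep being pointed at looks roughly like the sketch below. This is only my paraphrase: the frame size, the 30 fps timing, and the use of AVAssetWriterInputPixelBufferAdaptor (which is what actually accepts the pixel buffers) are my assumptions about what the tutorials mean.

```swift
import AVFoundation
import CoreMedia
import CoreVideo
import Foundation

// Hypothetical sketch: write a sequence of already-built CVPixelBuffers
// out as an H.264 .mov file at 30 fps. Frame size and rate are placeholders.
func writeMovie(from pixelBuffers: [CVPixelBuffer], to outputURL: URL) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 640,
        AVVideoHeightKey: 480
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)

    // The adaptor is the object that actually takes CVPixelBuffers.
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    guard writer.startWriting() else { return }
    writer.startSession(atSourceTime: .zero)

    for (index, buffer) in pixelBuffers.enumerated() {
        // Crude back-pressure handling; real code would use requestMediaDataWhenReady.
        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
        // Each frame's timestamp is index/30 seconds.
        let time = CMTime(value: CMTimeValue(index), timescale: 30)
        guard adaptor.append(buffer, withPresentationTime: time) else { break }
    }

    input.markAsFinished()
    writer.finishWriting { /* writing completes asynchronously */ }
}
```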


Solution

  • I'm not sure what information you're basing your question on, but I'll try to explain the basics.

    First, CVPixelBuffer is a Core Video object that stores raw image data; all of the AVFoundation classes that deal with image data use objects of this type. However, a CVPixelBuffer is not a simple object to construct: you can't just instantiate one from a blob of JPEG or PNG data.

    One possible way of creating a CVPixelBuffer is to call CVPixelBufferCreateWithBytes with the raw pixel bytes copied out of a CGImage's data provider (via CGDataProviderCopyData). There are other approaches that may work just as well and/or be more efficient; it depends on what kind of images you're starting with. One such alternative is sketched below.
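
    As a concrete illustration, here is a minimal Swift sketch of that alternative: allocate an empty CVPixelBuffer and draw the CGImage into its backing memory through a CGBitmapContext. The helper name and the 32ARGB / alpha-skip-first pixel format are assumptions chosen for the example.

```swift
import CoreGraphics
import CoreVideo

// Hypothetical helper: wrap a CGImage in a new CVPixelBuffer by drawing it
// into the buffer's backing memory with a CGBitmapContext.
func makePixelBuffer(from image: CGImage) -> CVPixelBuffer? {
    let attrs = [
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
    ] as CFDictionary

    var maybeBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     image.width,
                                     image.height,
                                     kCVPixelFormatType_32ARGB,
                                     attrs,
                                     &maybeBuffer)
    guard status == kCVReturnSuccess, let buffer = maybeBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Point a bitmap context at the pixel buffer's memory and draw the image into it.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: image.width,
                                  height: image.height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }

    context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
    return buffer
}
```

    Once you have CVPixelBuffers, the piece that actually appends them to the movie is an AVAssetWriterInputPixelBufferAdaptor attached to the AVAssetWriterInput, with a presentation timestamp for each frame.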