Tags: objective-c, ios, video, core-animation, avfoundation

Frame synchronization with AVPlayer


I'm having an issue syncing external content in a CALayer with an AVPlayer at high precision.

My first thought was to lay out an array of frames (equal to the number of frames in the video) within a CAKeyframeAnimation and sync with an AVSynchronizedLayer. However, upon stepping through the video frame-by-frame, it appears that AVPlayer and Core Animation redraw on different cycles, as there is a slight (but noticeable) delay between them before they sync up.
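
In sketch form, that setup looks something like this (placeholder names throughout: frameImages is an NSArray holding one CGImageRef per video frame, videoBounds is a rect matching the video; imagine it inside a view controller's @implementation):

    #import <AVFoundation/AVFoundation.h>
    #import <QuartzCore/QuartzCore.h>

    - (void)attachFrameLayerToPlayer:(AVPlayer *)player
                         frameImages:(NSArray *)frameImages
                         videoBounds:(CGRect)videoBounds
    {
        AVPlayerItem *item = player.currentItem;

        // Layer whose contents should track the video's timeline.
        CALayer *contentLayer = [CALayer layer];
        contentLayer.frame = videoBounds;

        // One discrete keyframe per video frame, spread across the asset's duration.
        CAKeyframeAnimation *anim = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
        anim.values = frameImages;
        anim.calculationMode = kCAAnimationDiscrete;
        anim.duration = CMTimeGetSeconds(item.asset.duration);
        anim.beginTime = AVCoreAnimationBeginTimeAtZero; // plain 0.0 would be rewritten as "now"
        anim.removedOnCompletion = NO;

        // AVSynchronizedLayer slaves its sublayers' animations to the item's timebase.
        AVSynchronizedLayer *syncLayer =
            [AVSynchronizedLayer synchronizedLayerWithPlayerItem:item];
        syncLayer.frame = videoBounds;
        [syncLayer addSublayer:contentLayer];
        [contentLayer addAnimation:anim forKey:@"frames"];
        [self.view.layer addSublayer:syncLayer];
    }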

Short of processing and displaying through Core Video, is there a way to accurately sync with an AVPlayer on the frame level?

Update: February 5, 2012

So far the best way I've found to do this is to pre-render through AVAssetExportSession coupled with AVVideoCompositionCoreAnimationTool and a CAKeyframeAnimation.
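
Roughly, the pre-render pass looks like this (a sketch, not exact code; outputURL is a placeholder, the 30 fps frame duration is an assumption, and the overlay animation is the same kind of CAKeyframeAnimation as above):

    #import <AVFoundation/AVFoundation.h>
    #import <QuartzCore/QuartzCore.h>

    - (void)exportWithOverlayAnimation:(CAKeyframeAnimation *)animation
                                 asset:(AVAsset *)asset
                             outputURL:(NSURL *)outputURL
    {
        AVAssetTrack *videoTrack =
            [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        CGSize size = videoTrack.naturalSize;

        // Layer tree handed to the compositor: video on the bottom, animation on top.
        CALayer *parentLayer  = [CALayer layer];
        CALayer *videoLayer   = [CALayer layer];
        CALayer *overlayLayer = [CALayer layer];
        parentLayer.frame = videoLayer.frame = overlayLayer.frame =
            CGRectMake(0, 0, size.width, size.height);
        [parentLayer addSublayer:videoLayer];
        [parentLayer addSublayer:overlayLayer];

        animation.beginTime = AVCoreAnimationBeginTimeAtZero;
        animation.removedOnCompletion = NO;
        [overlayLayer addAnimation:animation forKey:@"frames"];

        // A video composition that passes the source track through, plus the
        // Core Animation post-processing tool.
        AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
        composition.renderSize = size;
        composition.frameDuration = CMTimeMake(1, 30); // assumed 30 fps source

        AVMutableVideoCompositionInstruction *instruction =
            [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        AVMutableVideoCompositionLayerInstruction *layerInstruction =
            [AVMutableVideoCompositionLayerInstruction
                videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        instruction.layerInstructions = @[ layerInstruction ];
        composition.instructions = @[ instruction ];

        composition.animationTool = [AVVideoCompositionCoreAnimationTool
            videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                    inLayer:parentLayer];

        AVAssetExportSession *export =
            [[AVAssetExportSession alloc] initWithAsset:asset
                                             presetName:AVAssetExportPresetHighestQuality];
        export.videoComposition = composition;
        export.outputURL = outputURL;
        export.outputFileType = AVFileTypeQuickTimeMovie;
        [export exportAsynchronouslyWithCompletionHandler:^{
            NSLog(@"Export finished: %ld", (long)export.status);
        }];
    }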

I'm still very interested in learning of any real-time ways to do this, however.


Solution

  • What do you mean by 'high precision'?

    Although the docs claim that AVAssetReader is not designed for real-time usage, in practice I have had no problems reading video in real time with it (cf. https://stackoverflow.com/a/4216161/42961). The returned frames come with a presentation timestamp, which you can fetch using CMSampleBufferGetPresentationTimeStamp.

    You'll want one part of the project to be the 'master' timekeeper here. Assuming your CALayer animation is quick to compute and doesn't involve potentially blocking operations like disk access, I'd use it as the master time source. When you need to draw content (e.g. in drawRect: on your UIView subclass), read currentTime from the CALayer animation, advance if necessary through the AVAssetReader's video frames using copyNextSampleBuffer until CMSampleBufferGetPresentationTimeStamp returns a time >= currentTime, draw that frame, and then draw the CALayer animation content over the top, as in the sketch below.
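
    In sketch form (assumed names: reader and readerOutput are properties on the drawing view, and the asset is local so reading keeps pace; this is illustrative, not drop-in code):

        #import <AVFoundation/AVFoundation.h>

        // Setup, run once: a reader vending BGRA pixel buffers from the video track.
        - (BOOL)startReaderForAsset:(AVAsset *)asset error:(NSError **)error
        {
            AVAssetTrack *track =
                [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
            NSDictionary *settings = @{
                (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
            };
            self.readerOutput =
                [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                           outputSettings:settings];
            self.reader = [AVAssetReader assetReaderWithAsset:asset error:error];
            [self.reader addOutput:self.readerOutput];
            return [self.reader startReading];
        }

        // Called from the draw path with the animation's currentTime (the master
        // clock). Returns the first sample whose presentation timestamp has caught
        // up with the clock; the caller draws its image buffer, draws the CALayer
        // animation content over the top, then releases the sample.
        - (CMSampleBufferRef)copyFrameForTime:(CFTimeInterval)currentTime
        {
            CMSampleBufferRef sample = NULL;
            while ((sample = [self.readerOutput copyNextSampleBuffer]) != NULL) {
                CMTime pts = CMSampleBufferGetPresentationTimeStamp(sample);
                if (CMTimeGetSeconds(pts) >= currentTime) {
                    return sample; // caller must CFRelease() after drawing
                }
                CFRelease(sample); // frame is behind the master clock; skip it
            }
            return NULL; // reader exhausted
        }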