The following setup has been working on all recent iOS versions until iOS 10:

I am using an AVSampleBufferDisplayLayer to render raw frames from a custom source. I have a pixel buffer pool set up with CVPixelBufferPoolCreate, with kCVPixelBufferIOSurfacePropertiesKey set to @{} as instructed by Apple.
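For reference, the pool is created roughly like this (a minimal sketch; the 1280×720 frame size is a placeholder, not my exact code):

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

NSDictionary *pixelBufferAttributes = @{
    (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange),
    (__bridge NSString *)kCVPixelBufferWidthKey : @(1280),   // placeholder frame size
    (__bridge NSString *)kCVPixelBufferHeightKey : @(720),
    (__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}  // empty dictionary, per Apple's guidance
};
CVPixelBufferPoolRef pixelBufferPool = nullptr;
CVPixelBufferPoolCreate(kCFAllocatorDefault,
                        nullptr,  // pool attributes
                        (__bridge CFDictionaryRef)pixelBufferAttributes,
                        &pixelBufferPool);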
I use CVPixelBufferPoolCreatePixelBuffer to obtain a pixel buffer from the pool, and copy my data into it between CVPixelBufferLockBaseAddress and CVPixelBufferUnlockBaseAddress calls. My raw frames are in the NV12 format, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
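The per-frame copy looks roughly like this (again a sketch: srcY, srcUV and their strides stand in for my custom source and are not my exact code):

CVPixelBufferRef pixelBuffer = nullptr;
if (CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &pixelBuffer) != kCVReturnSuccess) {
    return;
}

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// NV12: plane 0 is Y, plane 1 is interleaved CbCr at half height.
uint8_t *dstY  = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
uint8_t *dstUV = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
const size_t dstStrideY  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
const size_t dstStrideUV = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
const size_t height = CVPixelBufferGetHeight(pixelBuffer);

// Copy row by row, since the pool's stride may differ from the source stride.
for (size_t row = 0; row < height; ++row) {
    memcpy(dstY + row * dstStrideY, srcY + row * srcStrideY, MIN(srcStrideY, dstStrideY));
}
for (size_t row = 0; row < height / 2; ++row) {
    memcpy(dstUV + row * dstStrideUV, srcUV + row * srcStrideUV, MIN(srcStrideUV, dstStrideUV));
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);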
Here is a code snippet showing how I convert the pixel buffer to a CMSampleBufferRef and enqueue it on the display layer:
// Wrap the pixel buffer in a CMSampleBuffer with timing info.
// CMSampleTimingInfo fields are {duration, presentationTimeStamp, decodeTimeStamp}.
CMSampleTimingInfo sampleTimeinfo{
    CMTimeMake(duration.count(), kOneSecond.count()),  // duration
    kCMTimeInvalid,                                     // presentationTimeStamp
    kCMTimeInvalid};                                    // decodeTimeStamp

CMFormatDescriptionRef formatDescription = nullptr;
CMVideoFormatDescriptionCreateForImageBuffer(nullptr, pixelBuffer, &formatDescription);

CMSampleBufferRef sampleBuffer = nullptr;
CMSampleBufferCreateForImageBuffer(
    nullptr, pixelBuffer, true, nullptr, nullptr, formatDescription, &sampleTimeinfo, &sampleBuffer);

// Mark the sample to be displayed immediately instead of being scheduled against a timebase.
CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
const CFIndex numElementsInArray = CFArrayGetCount(attachmentsArray);
for (CFIndex i = 0; i < numElementsInArray; ++i) {
    CFMutableDictionaryRef attachments = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachmentsArray, i);
    CFDictionarySetValue(attachments, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
}

if ([avfDisplayLayer_ isReadyForMoreMediaData]) {
    [avfDisplayLayer_ enqueueSampleBuffer:sampleBuffer];
}

CFRelease(sampleBuffer);
CFRelease(formatDescription);
pixelBuffer is of type CVPixelBufferRef, and avfDisplayLayer_ is an AVSampleBufferDisplayLayer.
This next snippet shows how I construct the display layer:
avfDisplayLayer_ = [[AVSampleBufferDisplayLayer alloc] init];
avfDisplayLayer_.videoGravity = AVLayerVideoGravityResizeAspectFill;
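The layer is attached to its hosting view in the usual way; roughly (parentView_ here is just a stand-in for the view that hosts the layer):

avfDisplayLayer_.frame = parentView_.bounds;
avfDisplayLayer_.backgroundColor = [UIColor blackColor].CGColor;  // set only to verify the layer is composited
[parentView_.layer addSublayer:avfDisplayLayer_];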
I am not getting any warnings or error messages, the display layer's status does not indicate a failure, and isReadyForMoreMediaData returns YES.
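This is how I check the layer after enqueuing (a sketch, with illustrative logging):

if (avfDisplayLayer_.status == AVQueuedSampleBufferRenderingStatusFailed) {
    // This branch is never taken – the status never reports a failure.
    NSLog(@"AVSampleBufferDisplayLayer failed: %@", avfDisplayLayer_.error);
}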
The problem is that my frames do not show on the screen. I have also set a background color on the display layer, just to make sure the layer is composited correctly (which it is).
Something must have changed in iOS 10 with regard to AVSampleBufferDisplayLayer, but I have not been able to figure out what it is.
It turns out that with iOS 10, the values in CMSampleTimingInfo are apparently checked more strictly. The code above was changed to the following to make rendering work correctly again:
CMSampleTimingInfo sampleTimeinfo{
    CMTimeMake(duration.count(), kOneSecond.count()),  // duration
    kCMTimeZero,                                        // presentationTimeStamp
    kCMTimeInvalid};                                    // decodeTimeStamp
Please note the kCMTimeZero for the presentationTimeStamp field.
@Sterling Archer: You may want to give this a try to see if it addresses your problem as well.