Tags: iphone, ios4, avfoundation, video-encoding, libav

How to Convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?


I'm trying to encode the iPhone camera's frames into an H.264 video using ffmpeg's libav* libraries. I found an Apple article showing how to convert a CMSampleBuffer to a UIImage, but how can I convert it to ffmpeg's AVPicture?

Thanks.


Solution

  • Answering my own question:

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    // access the data (the Core Video getters return size_t)
    int width = (int)CVPixelBufferGetWidth(pixelBuffer);
    int height = (int)CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    
    // Do something with the raw pixels here
    // ...
    
    // Fill in the AVFrame while the base address is still locked:
    // avpicture_fill() only points the frame's data pointers at
    // rawPixelBase, it does not copy the pixels.
    AVFrame *pFrame = avcodec_alloc_frame();
    avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);
    
    // Unlock only once you are done reading through pFrame
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    

    Now pFrame points at the contents of the sample buffer, whose pixel format is kCVPixelFormatType_32BGRA. ffmpeg's PIX_FMT_RGB32 means ARGB packed into a native-endian 32-bit word, which on a little-endian CPU like the iPhone's is exactly the B, G, R, A byte order of 32BGRA.
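    To see why PIX_FMT_RGB32 and kCVPixelFormatType_32BGRA line up, here is a small standalone check (plain C, no libav needed) of what an ARGB 32-bit word looks like byte-by-byte on a little-endian machine:

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* PIX_FMT_RGB32 is ARGB packed into one native-endian 32-bit word. */
        uint32_t argb = 0x11223344u;            /* A=0x11 R=0x22 G=0x33 B=0x44 */
        const uint8_t *bytes = (const uint8_t *)&argb;

        /* On a little-endian CPU (iPhone ARM, x86) the in-memory order is
           B, G, R, A -- the same byte layout as kCVPixelFormatType_32BGRA. */
        assert(bytes[0] == 0x44);  /* B */
        assert(bytes[1] == 0x33);  /* G */
        assert(bytes[2] == 0x22);  /* R */
        assert(bytes[3] == 0x11);  /* A */
        printf("memory order: %02X %02X %02X %02X\n",
               bytes[0], bytes[1], bytes[2], bytes[3]);
        return 0;
    }
    ```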

    This solved my issue. Thanks.
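    One caveat worth noting: CVPixelBufferGetBytesPerRow() can return more than width * 4, because Core Video may pad rows for alignment, while avpicture_fill() assumes tightly packed rows. If the two disagree, either set pFrame->linesize[0] to the real bytes-per-row value, or repack the buffer first. A minimal sketch of the repacking (the helper name and the toy 2x2 buffer are mine, not from the answer):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical helper: copy a row-padded BGRA buffer into the tightly
       packed layout (linesize == width * 4) that avpicture_fill() assumes. */
    static void repack_bgra(uint8_t *dst, const uint8_t *src,
                            int width, int height, size_t bytesPerRow)
    {
        size_t rowBytes = (size_t)width * 4;    /* 4 bytes per BGRA pixel */
        for (int y = 0; y < height; y++)
            memcpy(dst + (size_t)y * rowBytes,
                   src + (size_t)y * bytesPerRow, rowBytes);
    }

    int main(void) {
        /* Toy 2x2 image stored with 12-byte rows: 8 bytes of pixels
           followed by 4 bytes of padding per row. */
        uint8_t padded[24] = {
            1, 2, 3, 4,   5, 6, 7, 8,   0xAA, 0xBB, 0xCC, 0xDD, /* row 0 + pad */
            9, 10, 11, 12, 13, 14, 15, 16, 0xAA, 0xBB, 0xCC, 0xDD /* row 1 + pad */
        };
        uint8_t tight[16];
        repack_bgra(tight, padded, 2, 2, 12);
        assert(tight[0] == 1 && tight[7] == 8);    /* row 0 kept, pad dropped */
        assert(tight[8] == 9 && tight[15] == 16);  /* row 1 kept, pad dropped */
        return 0;
    }
    ```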