
Decoding and Rendering Video on Android


I need to decode video frames and render them onto a trapezoidal surface. I'm using Android 2.2 as my development platform.

I'm not using the MediaPlayer service since I need access to the decoded frames.

Here's what I have so far:

  • I am using the Stagefright framework to extract decoded video frames.
  • Each frame is then converted from YUV420 to RGB format (see the conversion sketch after this list).
  • The converted frames are then copied to a texture and rendered to an OpenGL surface.
  • Note that I am using Processing rather than making OpenGL calls directly.
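
For illustration, the YUV-to-RGB step typically looks something like the sketch below. This is a minimal CPU-side version, assuming the decoder outputs planar YUV420 (I420) with BT.601 color; the class and parameter names are illustrative, not from the original post.

    public final class Yuv420ToRgb {
        // Fixed-point BT.601 YUV -> ARGB conversion, one pixel at a time.
        // Assumes planar layout: all Y samples, then U, then V; adjust the
        // indexing for your decoder's actual output (e.g. NV21 interleaves VU).
        public static void convert(byte[] yuv, int width, int height, int[] argbOut) {
            final int frameSize = width * height;
            final int quarter = frameSize >> 2;
            for (int row = 0; row < height; row++) {
                for (int col = 0; col < width; col++) {
                    int y = (yuv[row * width + col] & 0xFF) - 16;
                    if (y < 0) y = 0;
                    // One chroma sample covers a 2x2 block of luma samples.
                    int c = (row >> 1) * (width >> 1) + (col >> 1);
                    int u = (yuv[frameSize + c] & 0xFF) - 128;
                    int v = (yuv[frameSize + quarter + c] & 0xFF) - 128;
                    int y1192 = 1192 * y;                      // 1.164 * 1024
                    int r = (y1192 + 1634 * v) >> 10;          // 1.596 * 1024
                    int g = (y1192 - 833 * v - 400 * u) >> 10;
                    int b = (y1192 + 2066 * u) >> 10;          // 2.018 * 1024
                    r = r < 0 ? 0 : (r > 255 ? 255 : r);
                    g = g < 0 ? 0 : (g > 255 ? 255 : g);
                    b = b < 0 ? 0 : (b > 255 ? 255 : b);
                    argbOut[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
                }
            }
        }
    }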

So now my problems are:

  • I can only decode MP4 files with Stagefright.
  • The rendering is too slow, around 100 ms for a 320x420 frame.
  • There is no audio yet; I can render video, but I still don't know how to synchronize audio playback with it.

So for my questions...

  • How can I support other video formats? Should I stick with Stagefright or switch to FFmpeg?
  • How can I improve the performance? I need to support at least 720p.
  • Should I make OpenGL calls directly instead of using Processing? Would that improve performance?
  • How can I keep the audio in sync during playback?

Solution

  • Adding other video formats and codecs to Stagefright

    If you have parsers for "other" video formats, you need to implement a Stagefright media-extractor plug-in and integrate it into AwesomePlayer. Similarly, if you have OMX components for the required video codecs, you need to integrate them into the OMXCodec class. Using FFmpeg components inside Stagefright, or using an FFmpeg-based player instead of Stagefright, does not seem trivial. However, if the required formats are already available in OpenCORE, you can modify the Android stack so that OpenCORE gets chosen for those formats; you would then need to port your logic for extracting YUV data to OpenCORE (and get your hands dirty with MIOs).

    Playback performance

    The Surface Flinger, used for normal playback, renders through Overlays, which usually provide around 4 to 8 video buffers (from what I have seen so far). So check how many buffers you are getting in your OpenGL rendering path; increasing the buffer count will definitely improve performance. Also, measure the time taken by the YUV-to-RGB conversion; you can optimize it yourself or use an open-source library to improve performance. OpenGL is usually not used for video rendering (it is known for graphics), so I am not sure about its performance there.
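
    One rendering-side optimization worth trying, independent of the buffer question above, is to allocate the GL texture once and stream each frame into it with glTexSubImage2D instead of recreating the texture per frame. Below is a minimal sketch using Android's GLES20 bindings (available since Android 2.2); it must run on the GL thread with a current OpenGL ES 2.0 context, and the class and field names are illustrative, not from the original post.

        import android.opengl.GLES20;
        import java.nio.IntBuffer;

        // Sketch: allocate one RGBA texture up front, then stream each decoded
        // frame into it with glTexSubImage2D instead of re-allocating storage.
        public final class FrameTexture {
            private final int[] textureId = new int[1];
            private final int width;
            private final int height;

            public FrameTexture(int width, int height) {
                this.width = width;
                this.height = height;
                GLES20.glGenTextures(1, textureId, 0);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                // Allocate storage once; pixel data is uploaded per frame below.
                GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                        width, height, 0, GLES20.GL_RGBA,
                        GLES20.GL_UNSIGNED_BYTE, null);
            }

            // Called once per decoded and converted frame.
            public void upload(IntBuffer rgbaPixels) {
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
                GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0,
                        width, height, GLES20.GL_RGBA,
                        GLES20.GL_UNSIGNED_BYTE, rgbaPixels);
            }
        }

    Note that GL_RGBA expects RGBA byte order, so ARGB ints produced by a conversion like the one sketched in the question would need their channels reordered before upload.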

    Audio Video Sync

    Audio time is used as the reference clock. In Stagefright, AwesomePlayer uses AudioPlayer to play out audio, and AudioPlayer implements an interface that reports the current playback time. AwesomePlayer uses this time to schedule video rendering: a video frame is rendered when its presentation timestamp matches that of the audio sample currently being played.
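
    If you play out audio yourself with an AudioTrack, you can derive the same master clock from the playback head position and gate video frames on it. A minimal sketch follows; the class name and the idea of a per-frame presentation timestamp are hypothetical here, not from the original post.

        import android.media.AudioTrack;

        // Sketch: use the audio playback position as the master clock and
        // render a video frame once its presentation time has been reached.
        public final class AvSync {
            private final AudioTrack audioTrack; // already configured and playing
            private final int sampleRateHz;

            public AvSync(AudioTrack audioTrack, int sampleRateHz) {
                this.audioTrack = audioTrack;
                this.sampleRateHz = sampleRateHz;
            }

            // Current audio clock in milliseconds, derived from frames played.
            // The mask treats the int head position as an unsigned value.
            public long audioClockMs() {
                long framesPlayed = audioTrack.getPlaybackHeadPosition() & 0xFFFFFFFFL;
                return framesPlayed * 1000L / sampleRateHz;
            }

            // True once the frame with this presentation timestamp is due.
            public boolean shouldRender(long videoPtsMs) {
                return videoPtsMs <= audioClockMs();
            }
        }

    A frame whose timestamp has fallen far behind the audio clock should be dropped rather than rendered late.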

    Shash