android, android-mediacodec, yuv

Can the Media Codec decoders output RGB-like formats?


I am trying to convert a VP9 video using the Android MediaCodec. When I set the KEY_COLOR_FORMAT of the format to something other than a YUV format, I get the following error: "[OMX.qcom.video.decoder.vp9] does not support color format XXXX."

The COLOR_FormatSurface format, for instance, does not seem to be supported, or I am doing something wrong. Do I need to perform a YUV to RGB conversion manually? If so, what is the purpose of being able to provide a SurfaceTexture to the codec?

Here is the sample code:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

import java.nio.ByteBuffer;

public class VideoDecoder
{
    private MediaCodec mVideoCodec = null;
    private ByteBuffer[] mInputBuffers;
    private ByteBuffer[] mOutputBuffers;

    public VideoDecoder(int width, int height)
    {
        try
        {

            // Media settings
            MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_VP9,
                    width,
                    height);


            format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            // COLOR_FormatYUV420Flexible

            // Configure the decoder
            mVideoCodec =  MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_VP9);

            mVideoCodec.configure(format,
                    null,
                    null,
                    0);

            // Start the decoder
            mVideoCodec.start();

            mInputBuffers = mVideoCodec.getInputBuffers();
            mOutputBuffers = mVideoCodec.getOutputBuffers();
        }
        catch(Exception e)
        {
            Log.e("VideoDecoder", "CreateCodec failed message =" + e.getMessage());
        }
    }

    public void release()
    {
        mVideoCodec.stop();
        mVideoCodec.release();
    }

    public void decode(byte[] rawBuffer, int frameSize)
    {
        //todo
    }
}

Thanks!


Solution

  • From the MediaCodec documentation, the way to go is to use a Surface: the codec can then implicitly convert its output into the texture backing that Surface (BGRA). The flow would look like this (see the sketch after the steps):

    1) Call SurfaceTexture.updateTexImage() when the onFrameAvailable callback fires. This must be done on the thread that owns the current graphics context.

    2) Copy the texture data to another texture. This requires transforming the texture coordinates according to the SurfaceTexture transform matrix (SurfaceTexture.getTransformMatrix()) in a shader.

    3) If required, read the result back on the CPU, for example with GLES20.glReadPixels().
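
Below is a minimal sketch of that flow, assuming a GL_TEXTURE_EXTERNAL_OES texture id has already been created on a GL thread with a current EGL context. The class and method names (SurfaceVideoDecoder, drawAndReadBack) are illustrative only, and shader compilation, FBO setup and the quad draw call are left out.

import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.opengl.GLES20;
import android.view.Surface;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch only: EGL context creation, shader compilation and the textured-quad
// draw call are assumed to exist elsewhere.
public class SurfaceVideoDecoder implements SurfaceTexture.OnFrameAvailableListener
{
    // Step 2: fragment shader sampling the decoder's external texture. The vertex
    // shader is assumed to multiply the texture coordinate by the matrix returned
    // by SurfaceTexture.getTransformMatrix() and pass it along as vTexCoord.
    private static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform samplerExternalOES uTexture;\n" +
            "void main() {\n" +
            "    gl_FragColor = texture2D(uTexture, vTexCoord);\n" +
            "}\n";

    private MediaCodec mVideoCodec;
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private final float[] mTexMatrix = new float[16];

    public SurfaceVideoDecoder(int width, int height, int oesTextureId) throws Exception
    {
        // oesTextureId must be a GL_TEXTURE_EXTERNAL_OES texture created on the GL thread.
        mSurfaceTexture = new SurfaceTexture(oesTextureId);
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mSurface = new Surface(mSurfaceTexture);

        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_VP9, width, height);

        mVideoCodec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_VP9);
        // Decoding to a Surface: no KEY_COLOR_FORMAT is set, the codec keeps its
        // native YUV format and the conversion happens when the frame is sampled.
        mVideoCodec.configure(format, mSurface, null, 0);
        mVideoCodec.start();
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture)
    {
        // Step 1: a new frame was rendered to the Surface. Signal the GL thread here;
        // updateTexImage() must run on the thread that owns the EGL context.
    }

    // Called on the GL thread after onFrameAvailable has been signalled.
    public void drawAndReadBack(int width, int height)
    {
        mSurfaceTexture.updateTexImage();                 // latch the decoded frame
        mSurfaceTexture.getTransformMatrix(mTexMatrix);   // feed this to the vertex shader

        // ... draw a full-screen quad with the shader above into an FBO-attached
        // RGBA texture (step 2, omitted here) ...

        // Step 3: optional CPU read-back of the rendered pixels.
        ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
    }

    public void release()
    {
        mVideoCodec.stop();
        mVideoCodec.release();
        mSurface.release();
        mSurfaceTexture.release();
    }
}

Note that when draining the decoder, a frame only reaches the Surface (and triggers onFrameAvailable) if you release its output buffer with render set to true, i.e. mVideoCodec.releaseOutputBuffer(index, true).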