Tags: android, android-ndk, surfaceview, android-mediacodec

Mediacodec rendering to surface slow


I'm trying to stream raw H.264 1080p data to Android and render it to a SurfaceView. The problem is that if I send the data faster than 45 fps, the decoder output is pixelated (the input and output indexes come back as -1, i.e. not ready).

The result is the same with 720p or lower-resolution video: I cannot render faster than 45 fps without pixelation.

However, if I set the render flag to false in releaseOutputBuffer(), I can reach 75 fps (and the input and output indexes I receive are normal).

So, is there a way to "unlock" the frame rate, or another way to render faster?

NOTE: I am doing this in the NDK.

Init decoder ()

AMediaFormat *AVm_format = AMediaFormat_new();
AVm_codec = AMediaCodec_createDecoderByType("video/avc");

AMediaFormat_setString(AVm_format, AMEDIAFORMAT_KEY_MIME, "video/avc");
// 1080p is 1920 wide by 1080 high
AMediaFormat_setInt32(AVm_format, AMEDIAFORMAT_KEY_WIDTH, 1920);
AMediaFormat_setInt32(AVm_format, AMEDIAFORMAT_KEY_HEIGHT, 1080);
// No color format is forced here: when decoding to a Surface the codec
// chooses its own output format.

// The NDK media API reports errors through media_status_t return values,
// not C++ exceptions, so check those instead of using try/catch.
media_status_t status = AMediaCodec_configure(AVm_codec, AVm_format, Nwindow, NULL, 0);
if (status != AMEDIA_OK) {
    LOGD("FAILED TO CONFIGURE DECODER\n");
} else {
    LOGD("Configure finished...\n");
    AMediaCodec_start(AVm_codec);
    LOGD("Decoder started\n");
}

Decoding(...)

// pData is the frame I receive
// sz is the size of the frame

ssize_t indx = AMediaCodec_dequeueInputBuffer(AVm_codec, 0);

    if (indx >= 0) {
        size_t insize;
        uint8_t *input = AMediaCodec_getInputBuffer(AVm_codec, indx, &insize);

        // Only copy if the frame actually fits in the codec's input buffer.
        if (input != NULL && sz <= insize) {
            memcpy(input, pData, sz);
            AMediaCodec_queueInputBuffer(AVm_codec, indx, 0, sz, 0, 0);
        }
    }

    // Drain every output buffer that is ready, not just one per input frame.
    ssize_t indy;
    while ((indy = AMediaCodec_dequeueOutputBuffer(AVm_codec, AVm_buffinfo, 0))
           != AMEDIACODEC_INFO_TRY_AGAIN_LATER) {
        if (indy >= 0) {
            AMediaCodec_releaseOutputBuffer(AVm_codec, indy, false);
        } else if (indy == AMEDIACODEC_INFO_OUTPUT_BUFFERS_CHANGED) {
            LOGD("output buffers changed\n");
        } else if (indy == AMEDIACODEC_INFO_OUTPUT_FORMAT_CHANGED) {
            AMediaFormat *format = AMediaCodec_getOutputFormat(AVm_codec);
            LOGD("format changed to: %s\n", AMediaFormat_toString(format));
            AMediaFormat_delete(format);
        } else {
            LOGD("unexpected info code: %zd\n", indy);
        }
    }
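One failure mode worth ruling out: if the network thread reuses pData before the decode path has finished copying it, the bitstream handed to the codec gets corrupted, which would produce exactly this kind of pixelation. A minimal sketch (plain C++; copyFrame is a hypothetical helper, not part of the NDK API) of taking an owned copy of each incoming frame so the producer can reuse its buffer immediately:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Take an owned copy of an incoming H.264 frame so the producer's buffer
// can be reused right away; the copy is what later gets memcpy'd into the
// codec's input buffer. (Illustrative helper; the name is an assumption.)
std::vector<uint8_t> copyFrame(const uint8_t *pData, size_t sz) {
    return std::vector<uint8_t>(pData, pData + sz);
}
```

A queue of such owned copies between the network thread and the decode loop keeps the two sides from racing on the same memory.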

If anything else is needed, please let me know.


Solution

  • I'm not entirely sure what's going on -- the rate at which decoded frames are delivered to SurfaceView should not affect the quality of the output. This sounds more like a problem with the way data is being fed into the decoder, e.g. you're overwriting a buffer of H.264 data that is still being read from.

    Frames sent to SurfaceView's Surface aren't dropped, so your releaseOutputBuffer(..., true) will block if you attempt to feed it frames faster than the device refresh rate. On most devices this is 60fps. You can read more about the way the system works in the graphics architecture doc.

    One thing to bear in mind is that the decoded video frames aren't rendered by releaseOutputBuffer() so much as forwarded. There is some cost for the IPC transaction, but I expect that most of what you're seeing is the effect of the call blocking to maintain a steady 16.7ms per frame.
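    Since releaseOutputBuffer(..., true) blocks at the display refresh rate, the only way to run the decoder faster than the display is to drop the frames that have fallen behind. A minimal sketch of that pacing decision (plain C++; shouldRenderFrame is a hypothetical helper, and the 16.7 ms threshold assumes a 60 Hz display), whose result would be passed as the render flag to AMediaCodec_releaseOutputBuffer:

    ```cpp
    #include <cstdint>

    // Render a decoded frame only if it is no more than about one display
    // refresh (~16.7 ms at 60 Hz) behind the clock; otherwise drop it so the
    // render call never has to block to catch up.
    // (Hypothetical helper for illustration, not an NDK API.)
    bool shouldRenderFrame(int64_t presentationUs, int64_t nowUs,
                           int64_t lateThresholdUs = 16700) {
        return nowUs - presentationUs <= lateThresholdUs;
    }
    ```

    With this, the decoder runs as fast as it can while the display only shows the frames it can keep up with, which is what players typically do when the source outpaces the refresh rate.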