
android - setting presentation time of mediacodec


I used the code below to encode raw camera data to H264 in order to create a video. The encoding itself works, but the video plays back too fast, so there seems to be a problem with the presentation time. When recording starts I set the value of tstart, and for each frame I calculate the difference between the current time and tstart and pass it to queueInputBuffer, but nothing changed. Which part is wrong? I know that from Android 4.3 on I could pass a Surface to MediaCodec, but I want to support Android 4.1. Thanks in advance.
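For reference, queueInputBuffer expects the presentation time in microseconds, while System.nanoTime() returns nanoseconds, so the delta has to be divided by 1000. A minimal, self-contained sketch of that conversion (the class and method names here are illustrative, not part of the Android API):

```java
public class TimestampDemo {
    // Convert a nanosecond delta from System.nanoTime() to the
    // microsecond presentation time MediaCodec.queueInputBuffer expects.
    static long toPresentationTimeUs(long frameTimeNs, long startTimeNs) {
        return (frameTimeNs - startTimeNs) / 1000L;
    }

    public static void main(String[] args) {
        long tstart = 1_000_000_000L;  // hypothetical recording start, ns
        long frame  = 1_033_366_667L;  // ~33.37 ms later (one frame at 30 fps)
        System.out.println(toPresentationTimeUs(frame, tstart)); // prints 33366
    }
}
```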

public void onPreviewFrame(final byte[] bytes, Camera camera) {
    if (recording && mThread.isAlive()) {
        // Timestamp of this frame relative to the start of recording, in nanoseconds
        long time = System.nanoTime() - tstart;
        encode(bytes, time);
    }
}

private synchronized void encode(byte[] dataInput, long time) {
    // Convert the NV21 preview frame to the planar YUV420 layout the encoder expects
    byte[] data = new byte[dataInput.length];
    NV21toYUV420Planar(dataInput, data, 640, 480);

    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();

    // Block until an input buffer is available (timeout of -1 waits indefinitely)
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex < 0) {
        return;
    }

    ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
    inputBuffer.clear();
    inputBuffer.put(data);
    // queueInputBuffer expects the presentation time in microseconds,
    // so convert the nanosecond delta
    time /= 1000;
    mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, time, 0);

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            // The valid data starts at bufferInfo.offset and is bufferInfo.size long
            outBuffer.position(bufferInfo.offset);
            outBuffer.limit(bufferInfo.offset + bufferInfo.size);
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                // Writing the raw H264 stream to a file; note that this
                // discards bufferInfo.presentationTimeUs entirely
                fos.write(outData, 0, outData.length);
                fos.flush();
                Log.i("camera", "out data -- > " + outData.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            // The output buffer set has been replaced; fetch the new array
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The encoder's real output format is now known
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
        outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    } while (outputBufferIndex != MediaCodec.INFO_TRY_AGAIN_LATER);
}

Solution

  • Your problem is that you don't write the output frames into a container that stores any timestamps at all. You are writing a plain H264 file, which contains only the raw encoded frames: no index, no timestamps, no audio, nothing else.

    In order to get proper timestamps into the file, you need to use MediaMuxer (which appeared in 4.3) or a similar third-party library (e.g. libavformat) to write the encoded packets into a container. The timestamp of each output packet is in bufferInfo.presentationTimeUs, and in the if (outputBufferIndex >= 0) { clause you don't use it at all; you are basically throwing the timestamps away.
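    A rough sketch of what the MediaMuxer path could look like, assuming a configured and started encoder named mMediaCodec as in the question (the output path is illustrative, error handling and end-of-stream handling are omitted, and this requires API 18+, so it does not meet the 4.1 requirement; a third-party muxer would be needed there):

        // Sketch: feed encoder output into MediaMuxer so bufferInfo.presentationTimeUs
        // is preserved in the container instead of being thrown away.
        MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        int trackIndex = -1;
        boolean muxerStarted = false;

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        while (outputBufferIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
            if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Add the track once the encoder reports its real output format
                trackIndex = muxer.addTrack(mMediaCodec.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (outputBufferIndex >= 0) {
                ByteBuffer outBuffer = mMediaCodec.getOutputBuffers()[outputBufferIndex];
                if (muxerStarted && bufferInfo.size > 0) {
                    // The muxer consumes bufferInfo.presentationTimeUs here,
                    // which is what gives the video its correct playback speed
                    muxer.writeSampleData(trackIndex, outBuffer, bufferInfo);
                }
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            }
            outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        }
        // When recording ends: muxer.stop(); muxer.release();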