I am trying to decode H.264-encoded camera frames streamed from an Android device to a Windows PC. I am using the Media Foundation H264 decoder to decode every frame sent from the Android device, but I keep getting the error 'more input samples are required to process output'.
The following is the input requirement of the MF H264 decoder:
Media samples contain H.264 bitstream data with start codes and interleaved SPS/PPS. Each sample contains one complete picture, either one field or one frame.
I was wondering whether this is compatible with what I am sending from the Android device.
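One way to sanity-check this is to scan the outgoing buffers for Annex-B start codes and look at which NAL unit types they carry (in H.264, SPS is type 7, PPS is type 8, IDR slices are type 5). The helper below is a hypothetical sketch, not part of the original code:

```java
import java.util.ArrayList;
import java.util.List;

public class NalScanner {
    /**
     * Returns the H.264 NAL unit types found after each Annex-B start code
     * (00 00 01; a four-byte 00 00 00 01 start code ends the same way).
     * SPS = 7, PPS = 8, IDR slice = 5.
     */
    public static List<Integer> nalTypes(byte[] data) {
        List<Integer> types = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
                types.add(data[i + 3] & 0x1F); // low 5 bits of the NAL header byte
                i += 3; // skip past the start code
            }
        }
        return types;
    }

    public static void main(String[] args) {
        // minimal fabricated buffer: SPS (0x67), PPS (0x68), IDR slice (0x65)
        byte[] sample = {0, 0, 0, 1, 0x67, 0, 0, 0, 1, 0x68, 0, 0, 1, 0x65};
        System.out.println(nalTypes(sample)); // [7, 8, 5]
    }
}
```

If the types list for a keyframe buffer contains 7 and 8 ahead of the slice data, the stream matches the "start codes with interleaved SPS/PPS" part of the requirement.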
I am using the code below to send frames to the Windows device:
private static class AvcEncoder {

    final static int FRAME_RATE = 15;
    final static int MOTION_RANK = 2;

    private MediaCodec mediaCodec;
    private OutputStream outputStream; // set up elsewhere (e.g. the socket stream to the PC)

    public AvcEncoder(int width, int height) throws IOException {
        mediaCodec = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
        // rough bitrate heuristic: width * height * fps * motion * 0.07
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, (int) (width * height * FRAME_RATE * MOTION_RANK * 0.07));
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }

    public void close() {
        try {
            mediaCodec.stop();
            mediaCodec.release();
            outputStream.flush();
            outputStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // frames are sent here
    public void offerEncoder(byte[] input) {
        try {
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(input);
                // note: presentationTimeUs is always 0 here
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
            }
            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            while (outputBufferIndex >= 0) {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                // write the frame length as a 4-byte big-endian prefix
                byte[] frameLength = ByteBuffer.allocate(4).putInt(outData.length).array();
                outputStream.write(frameLength, 0, 4);
                // write the actual frame
                outputStream.write(outData, 0, outData.length);
                Log.i("AvcEncoder", outData.length + " bytes written");
                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}
If required, I can also post the Windows code.
Also, when I save the streamed data to a file, I am able to play it in VLC, so VLC is able to understand and decode it quite well.
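For reference, the framing written by offerEncoder is just a 4-byte big-endian length followed by the Annex-B frame bytes. A receiver (sketched here in Java for illustration, even though the actual receiver is the Windows code) can split the stream back into frames like this:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class FrameReader {
    /**
     * Reads one length-prefixed frame from the stream, or returns null at
     * end of stream. The 4-byte prefix is big-endian, matching the default
     * byte order of the ByteBuffer used on the sending side.
     */
    public static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int length;
        try {
            length = din.readInt(); // 4-byte big-endian frame length
        } catch (EOFException e) {
            return null; // clean end of stream between frames
        }
        byte[] frame = new byte[length];
        din.readFully(frame); // throws if the stream is truncated mid-frame
        return frame;
    }
}
```

Each byte[] returned this way is one complete encoded picture, which is exactly the per-sample granularity the MF decoder asks for.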
Well, the fix was very simple. I don't know why, but Media Foundation requires you to set either the duration or the timestamp of each sample fed to the decoder. The output, however, doesn't seem to be affected in any way by these two parameters.
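Media Foundation expresses sample times and durations in 100-nanosecond units (10,000,000 per second), so for the 15 fps stream above the per-frame numbers are easy to compute. The arithmetic is shown as plain Java below since the actual IMFSample::SetSampleTime / IMFSample::SetSampleDuration call sites are in the C++ Windows code; the class and method names are illustrative only:

```java
public class SampleTiming {
    // Media Foundation time unit: 100 ns, i.e. 10,000,000 units per second.
    static final long HNS_PER_SECOND = 10_000_000L;
    static final int FRAME_RATE = 15; // matches the encoder's KEY_FRAME_RATE

    /** Duration of one frame in 100-ns units (~666,666 at 15 fps). */
    static long frameDuration() {
        return HNS_PER_SECOND / FRAME_RATE;
    }

    /** Timestamp of the n-th frame in 100-ns units; multiply before dividing
     *  to avoid accumulating the per-frame rounding error. */
    static long sampleTime(long frameIndex) {
        return frameIndex * HNS_PER_SECOND / FRAME_RATE;
    }

    public static void main(String[] args) {
        System.out.println(frameDuration()); // 666666
        System.out.println(sampleTime(15));  // 10000000, i.e. one second in
    }
}
```

Passing these values to SetSampleTime/SetSampleDuration on each IMFSample before ProcessInput is what resolved the 'more input samples are required' error in my case.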
Edit:
Note that the ProcessInput method of the IMFTransform should return MF_E_NO_SAMPLE_DURATION or MF_E_NO_SAMPLE_TIMESTAMP if the sample duration or timestamp is not set on the IMFSample object passed to ProcessInput. However, in my case it didn't return any such value.