For testing purposes I am creating a new video from an existing one using MediaExtractor and MediaMuxer. I expect the new video to have exactly the same duration as the original, but it does not: the new video's duration is slightly shorter.
fun test(firstVideo: FileDescriptor, outputFileAbsolutePathUri: String) {
    val extractor = MediaExtractor().apply {
        setDataSource(firstVideo)
    }
    val muxer = MediaMuxer(outputFileAbsolutePathUri, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
    try {
        val MAX_SAMPLE_SIZE = 20 * 1024 * 1024
        val bufferSize: Int = MAX_SAMPLE_SIZE
        val dstBuf: ByteBuffer = ByteBuffer.allocate(bufferSize)
        val bufferInfo = MediaCodec.BufferInfo()
        val indexMap = setMuxerTracks(extractor, muxer)
        muxer.start()
        muxDataFromExtractor(muxer, extractor, indexMap, dstBuf, bufferInfo)
        muxer.stop()
    } finally {
        extractor.release()
        muxer.release()
    }
}
private fun setMuxerTracks(extractor: MediaExtractor, muxer: MediaMuxer): Map<Int, Int> {
    val indexMap = HashMap<Int, Int>(extractor.trackCount)
    for (i in 0 until extractor.trackCount) {
        extractor.selectTrack(i)
        val format: MediaFormat = extractor.getTrackFormat(i)
        val dstIndex = muxer.addTrack(format)
        indexMap[i] = dstIndex
    }
    return indexMap
}
private fun muxDataFromExtractor(muxer: MediaMuxer,
                                 extractor: MediaExtractor,
                                 trackIndexMap: Map<Int, Int>,
                                 dstBuf: ByteBuffer,
                                 bufferInfo: MediaCodec.BufferInfo) {
    var sawEOS = false
    val initialPresentationTimeUs = bufferInfo.presentationTimeUs
    while (!sawEOS) {
        bufferInfo.offset = 0
        bufferInfo.size = extractor.readSampleData(dstBuf, 0)
        if (bufferInfo.size < 0) {
            sawEOS = true
            bufferInfo.size = 0
        } else {
            bufferInfo.presentationTimeUs = initialPresentationTimeUs + extractor.sampleTime
            bufferInfo.flags = extractor.sampleFlags
            val trackIndex = extractor.sampleTrackIndex
            muxer.writeSampleData(trackIndexMap[trackIndex]!!, dstBuf, bufferInfo)
            extractor.advance()
        }
    }
}
Just for the sake of comparison: the original video's duration was 3366666 microseconds and the created video's duration was 3366366 microseconds. The duration is retrieved from MediaFormat (MediaFormat.KEY_DURATION).
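To put that 300-microsecond gap in perspective, it is far below one frame interval at 30 fps, so no frame was dropped during the copy; it is a sub-frame timestamp discrepancy. A minimal JVM-only sketch using the two durations quoted above (the 30 fps figure comes from the ffprobe output further down):

```kotlin
fun durationDiffUs(sourceUs: Long, resultUs: Long): Long = sourceUs - resultUs

// A diff smaller than one frame interval means no whole frame was lost.
fun isSubFrame(diffUs: Long, fps: Long): Boolean = diffUs < 1_000_000L / fps

fun main() {
    val diff = durationDiffUs(3_366_666L, 3_366_366L)
    println(diff)                 // 300 µs
    println(isSubFrame(diff, 30)) // true: well under the 33_333 µs frame interval
}
```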
It appears to depend on how the source video was created. When I used ffprobe to inspect the source video's metadata, I got the following:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'source.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf58.76.100
Duration: 00:00:03.37, start: 0.000000, bitrate: 123 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 400x136, 118 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
Metadata:
handler_name : Core Media Video
The resulting video's metadata was:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'result.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2022-01-06T12:01:21.000000Z
com.android.version: 11
Duration: 00:00:03.37, start: 0.000000, bitrate: 126 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 400x136, 118 kb/s, SAR 1:1 DAR 50:17, 30 fps, 30 tbr, 90k tbn, 60 tbc (default)
Metadata:
creation_time : 2022-01-06T12:01:21.000000Z
handler_name : VideoHandle
When I use a video that was itself created with MediaExtractor and MediaMuxer as the source, the durations match (within a 1-microsecond threshold).
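One detail worth noting in the ffprobe output is that the two files use different track timebases: 15360 tbn in the source versus 90k tbn in the remuxed file. MP4 stores sample times and durations as integer ticks of the track timescale, so a duration expressed in microseconds generally cannot round-trip exactly through either timebase, and each remux can shift the reported duration by up to a tick's worth of rounding. A sketch of that round-trip, using the timescales from the ffprobe output above (the rounding convention is my assumption, not necessarily what the muxer does internally):

```kotlin
import kotlin.math.roundToLong

// Convert a microsecond duration to integer ticks of a given MP4
// timescale and back, as a muxer and a parser effectively must.
fun roundTripUs(durationUs: Long, timescale: Long): Long {
    val ticks = (durationUs * timescale / 1_000_000.0).roundToLong()
    return (ticks * 1_000_000.0 / timescale).roundToLong()
}

fun main() {
    println(roundTripUs(3_366_666L, 15_360L)) // 3366667: off by 1 µs in the source timebase
    println(roundTripUs(3_366_666L, 90_000L)) // 3366667: off by 1 µs in the 90 kHz timebase
}
```

Rounding alone only accounts for microsecond-level drift, so it cannot by itself explain the full 300 µs gap; another plausible contributor is that the muxer derives the track duration from the sample timestamps it is fed and has to estimate the last sample's duration (there is no "next" timestamp after it), which need not match the source container's declared duration. That would also be consistent with the remux-of-a-remux matching to within 1 µs: once the timestamps come from a file the muxer itself produced, its estimate becomes stable.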