Tags: html, google-chrome, webrtc, getusermedia, web-mediarecorder

MediaRecorder timeslice segments - only the first segment plays


I have the following on the latest Chrome:

 var options = { mimeType: "video/webm;codecs=vp8" };
 internalMediaRecorder = new MediaRecorder(internalStream, options);
 internalMediaRecorder.ondataavailable = function (event) {
        // put event.data (a Blob) into an array
        var src = URL.createObjectURL(event.data);
        const $container = $("body");
        const $video = $("<video id='" + Date.now() + "-" + event.data.size + "' controls src='" + src + "'></video>").css("max-width", "100%");
        $container.prepend($video);
 }
 internalMediaRecorder.start(segmentLengthInMs); // fire dataavailable every 5s

I then compile an array of 5-second segments; the blob data is available. However, when I create a URL for each of these segments:

 URL.createObjectURL(videoSegment)

Only the first video plays. Why is this?

UPDATE

If I stop/start the recorder in ondataavailable, I get playable segments, separated by unplayable mini-segments because I call stop right after processing each video. I can approximate the desired behavior by ignoring blobs below some size threshold to skip the "dead gap" segments. This smells like feet, though, and I'd like to get proper segmentation working if possible.
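For reference, the workaround described above can be sketched like this. Everything here is an assumption about the approach, not tested production code: `recordInSegments`, `isPlayableSegment`, and `MIN_SEGMENT_BYTES` are hypothetical names, and the byte threshold would need tuning against your actual bitrate.

```javascript
// Assumed threshold below which a blob is treated as a "dead gap" segment.
const MIN_SEGMENT_BYTES = 10 * 1024;

function isPlayableSegment(blob, minBytes = MIN_SEGMENT_BYTES) {
  return blob.size >= minBytes;
}

// Restart the recorder after each segment so every delivered blob is a
// self-contained WebM file, filtering out the tiny gap blobs.
function recordInSegments(stream, segmentLengthInMs, onSegment) {
  const options = { mimeType: "video/webm;codecs=vp8" };
  let recorder;
  let stopped = false;

  function startOne() {
    recorder = new MediaRecorder(stream, options);
    recorder.ondataavailable = (event) => {
      if (isPlayableSegment(event.data)) {
        onSegment(event.data); // standalone, playable segment
      }
      if (!stopped) startOne(); // restart; the stop/start gap yields the tiny blobs we drop
    };
    recorder.start();
    setTimeout(() => recorder.stop(), segmentLengthInMs);
  }

  startOne();
  return function stopAll() {
    stopped = true;
    recorder.stop();
  };
}
```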


Solution

  • It's expected, per the spec:

    The UA MUST record stream in such a way that the original Tracks can be retrieved at playback time. When multiple Blobs are returned (because of timeslice or requestData()), the individual Blobs need not be playable, but the combination of all the Blobs from a completed recording MUST be playable.

    The resulting blobs are not raw video data; they are encoded with the requested MIME type. So you need to merge all the blobs in the correct order to generate a playable video file.

    var options = { mimeType: "video/webm;codecs=vp8" };
    var recordedBlobs = [];
    internalMediaRecorder = new MediaRecorder(internalStream, options);
    internalMediaRecorder.ondataavailable = function (event) {
        if (event.data && event.data.size > 0) {
            recordedBlobs.push(event.data);
        }
    }
    internalMediaRecorder.start(segmentLengthInMs); // every 5s
    
    function play() {
       // videoElement is an existing <video> element on the page
       var superBuffer = new Blob(recordedBlobs, {type: 'video/webm'});
       videoElement.src = window.URL.createObjectURL(superBuffer);
    }
    

    See the demo
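One detail the snippet above leaves implicit: the recorder delivers its final chunk only when you call stop(), so the merge should happen after the stop event fires. A minimal sketch of that ordering, assuming `mergeSegments` and `stopAndPlay` as hypothetical helper names and `videoElement` as an existing video element:

```javascript
// Chunks must be concatenated in recording order; per the spec, only the
// full concatenation is guaranteed to be playable.
function mergeSegments(recordedBlobs) {
  return new Blob(recordedBlobs, { type: "video/webm" });
}

function stopAndPlay(recorder, recordedBlobs, videoElement) {
  return new Promise((resolve) => {
    recorder.onstop = () => {
      const superBuffer = mergeSegments(recordedBlobs);
      videoElement.src = URL.createObjectURL(superBuffer);
      resolve(superBuffer);
    };
    recorder.stop(); // emits one final dataavailable, then onstop
  });
}
```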