Tags: javascript, google-chrome, html5-audio, web-mediarecorder, recordrtc

How can I add predefined length to audio recorded from MediaRecorder in Chrome?


I am in the process of replacing RecordRTC with the built-in MediaRecorder for recording audio in Chrome. The recorded audio is then played back in the application with the audio API. I am having trouble getting the audio.duration property to work. The documentation says:

If the video (audio) is streamed and has no predefined length, "Inf" (Infinity) is returned.

With RecordRTC, I had to use ffmpeg_asm.js to convert the audio from WAV to Ogg. My guess is that somewhere in that process RecordRTC sets the predefined audio length. Is there any way to set a predefined length using MediaRecorder?
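
For context, here is a stripped-down sketch of the recording flow involved (the element id and the 5-second cut-off are placeholders, not my actual code):

    // Simplified sketch: record a few seconds, then inspect the duration of the result.
    async function recordAndInspect() {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const recorder = new MediaRecorder(stream);
      const chunks = [];
      recorder.ondataavailable = (e) => chunks.push(e.data);
      recorder.onstop = () => {
        const blob = new Blob(chunks, { type: recorder.mimeType });
        const player = document.getElementById("player"); // placeholder <audio> element
        player.src = URL.createObjectURL(blob);
        player.addEventListener("loadedmetadata", () => {
          // In Chrome this logs Infinity instead of the real length
          console.log("duration:", player.duration);
        });
      };
      recorder.start();
      setTimeout(() => recorder.stop(), 5000);
    }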


Solution

  • This is a Chrome bug.

    Firefox does expose the duration of the recorded media, and if you set the currentTime of the recorded media to a value greater than its actual duration, then the property becomes available in Chrome as well...

    function exportAudio(blob) {
      const aud = document.getElementById("aud");
      aud.src = URL.createObjectURL(blob);
      aud.addEventListener("loadedmetadata", () => {
        // It should have been already available here
        console.log("duration:", aud.duration);
        // Handle chrome's bug
        if (aud.duration === Infinity) {
          // Set it to bigger than the actual duration
          aud.currentTime = 1e101;
          aud.addEventListener("timeupdate", () => {
            console.log("after workaround:", aud.duration);
            aud.currentTime = 0;
          }, { once: true });
        }
      });
    }
    
    // We need user-activation
    document.getElementById("button").onclick = async ({ target }) => {
      target.remove();
      const resp = await fetch("https://upload.wikimedia.org/wikipedia/commons/4/4b/011229beowulf_grendel.ogg");
      const audioData = await resp.arrayBuffer();
      const ctx = new AudioContext();
      const audioBuf = await ctx.decodeAudioData(audioData);
      const source = ctx.createBufferSource();
      source.buffer = audioBuf;
      const dest = ctx.createMediaStreamDestination();
      source.connect(dest);
    
      const recorder = new MediaRecorder(dest.stream);
      const chunks = [];
      recorder.ondataavailable = ({data}) => chunks.push(data);
      recorder.onstop = () => exportAudio(new Blob(chunks));
      source.start(0);
      recorder.start();
      console.log("Recording...");
      // Record only 5 seconds
      setTimeout(function() {
        recorder.stop();
      }, 5000);
    
    }
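    <!-- Markup used by the script above: the start button and the playback element -->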
    <button id="button">start</button>
    <audio id="aud" controls></audio>

    So the advice here would be to star the bug report so that the Chromium team takes some time to fix it, even though this workaround can do the trick in the meantime...


    2024 update

    Since this answer was posted, it has come to seem unlikely that the MediaRecorder API will ever fix this.
    Hopefully the WebCodecs API will eventually provide a way to do this with enough browser support, but for now, to fix the initial issue you need to repack the generated media yourself after the whole media has been recorded. (This means you need to do it wherever you keep all the chunks, which might be on your server.) You may take a look at this answer for one such library (which I haven't tested myself) that seems to work both in browsers and in Node.
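
    As a rough illustration of that repacking step, the sketch below assumes a hypothetical fixWebmDuration(blob, durationMs) helper (standing in for whichever library you pick to rewrite the container's duration metadata); the recording length is simply measured on the client:

    // Sketch only: fixWebmDuration(blob, durationMs) is a hypothetical helper
    // standing in for a real repacking library.
    async function recordWithKnownDuration(stream, ms) {
      const recorder = new MediaRecorder(stream);
      const chunks = [];
      let startedAt = 0;

      recorder.ondataavailable = (e) => chunks.push(e.data);
      recorder.onstart = () => { startedAt = performance.now(); };

      const stopped = new Promise((resolve) => { recorder.onstop = resolve; });
      recorder.start();
      setTimeout(() => recorder.stop(), ms);
      await stopped;

      const durationMs = performance.now() - startedAt;
      const rawBlob = new Blob(chunks, { type: recorder.mimeType });
      // Repack after the whole media has been recorded, as described above.
      return fixWebmDuration(rawBlob, durationMs);
    }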