Tags: wav, web-audio-api, web-mediarecorder

How do I use the MediaRecorder API on processed audio?


I am taking microphone input and processing it to run an FFT on the data, but the specifics of that are irrelevant to this question. A rough overview of my current code:

const microphone = await navigator.mediaDevices.getUserMedia({video: false, audio: true});
const context = new AudioContext();
const stream = context.createMediaStreamSource(microphone);
const processor = context.createScriptProcessor(BUFFER_BYTES, 1, 1);
const analyser = context.createAnalyser();

// ...

stream.connect(analyser);
analyser.connect(processor);
processor.connect(context.destination);

I would also like to take this audio and record it into a .wav file. How can I do this? Is it possible to duplicate my microphone input stream such that I can process it via the nodes I am currently using, and record it via a MediaRecorder as well? Or can I simply add a MediaRecorder as a node in my audio pipeline?


Solution

  • You can use a MediaStreamAudioDestinationNode to get the processed audio back as a MediaStream. It can be created like this:

    const mediaStreamDestination = context.createMediaStreamDestination();
    

    You can then connect your processor to the mediaStreamDestination as well.

    processor.connect(mediaStreamDestination);
    

    The stream provided by the mediaStreamDestination can then be used to create a MediaRecorder.

    const mediaRecorder = new MediaRecorder(mediaStreamDestination.stream);
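Putting the pieces together, a minimal sketch of the full recording flow might look like the following. Note that `recordProcessedAudio`, `buildRecordingBlob`, and the `durationMs` parameter are illustrative names of my own, not part of any API; `context` and `processor` are assumed to be the nodes from the question's snippet.

```javascript
// Combine recorded chunks into a single Blob of the recorder's MIME type.
function buildRecordingBlob(chunks, mimeType) {
  return new Blob(chunks, { type: mimeType });
}

// Record the output of an existing processing node for a fixed duration.
// `context` is a running AudioContext; `processor` is the node whose
// output should be captured.
function recordProcessedAudio(context, processor, durationMs) {
  return new Promise((resolve) => {
    // Route the processed audio into a new MediaStream.
    const mediaStreamDestination = context.createMediaStreamDestination();
    processor.connect(mediaStreamDestination);

    // Record that stream with a MediaRecorder.
    const mediaRecorder = new MediaRecorder(mediaStreamDestination.stream);
    const chunks = [];

    mediaRecorder.ondataavailable = (event) => chunks.push(event.data);
    mediaRecorder.onstop = () =>
      resolve(buildRecordingBlob(chunks, mediaRecorder.mimeType));

    mediaRecorder.start();
    setTimeout(() => mediaRecorder.stop(), durationMs);
  });
}
```

The resulting Blob is in whatever container the browser chose (typically audio/webm or audio/ogg), not yet a .wav file.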
    

    Unfortunately, no browser supports recording .wav files out of the box. But I created a package that can be used to "extend" the native MediaRecorder with custom codecs. It's called extendable-media-recorder. There is an example in the readme which shows how it can be used to record .wav files.
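    For reference, a .wav (RIFF/PCM) file is just a 44-byte header followed by raw PCM samples, so for simple cases you can also build the file yourself from the Float32 data the Web Audio API hands you. A rough sketch, with function names of my own choosing:

```javascript
// Build the 44-byte RIFF/WAVE header for uncompressed PCM audio.
function makeWavHeader({ sampleRate, numChannels, bitsPerSample, dataByteLength }) {
  const bytesPerSample = bitsPerSample / 8;
  const buffer = new ArrayBuffer(44);
  const view = new DataView(buffer);
  const writeString = (offset, text) => {
    for (let i = 0; i < text.length; i++) view.setUint8(offset + i, text.charCodeAt(i));
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataByteLength, true);  // remaining file size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                  // fmt chunk size
  view.setUint16(20, 1, true);                   // audio format: PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  view.setUint16(32, numChannels * bytesPerSample, true);              // block align
  view.setUint16(34, bitsPerSample, true);
  writeString(36, 'data');
  view.setUint32(40, dataByteLength, true);
  return buffer;
}

// Convert Web Audio float samples (range [-1, 1]) to 16-bit signed PCM.
function floatTo16BitPcm(float32Samples) {
  const pcm = new Int16Array(float32Samples.length);
  for (let i = 0; i < float32Samples.length; i++) {
    const s = Math.max(-1, Math.min(1, float32Samples[i]));
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return pcm;
}
```

    Concatenating the header and the PCM bytes into a Blob of type audio/wav yields a playable file; the package mentioned above wraps a wav encoder like this behind the standard MediaRecorder interface.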