javascript, audio-streaming, web-audio-api

Real-time recording and playing audio as a buffer


As the title indicates, I just want to record audio into a buffer and play those bytes back at the same time, using only JavaScript in the browser (not Node.js). I researched this for a while and finally tried the method below, which works for recording:

    const handleSuccess = function(stream) {
        const context = new AudioContext();
        const source = context.createMediaStreamSource(stream);
        const processor = context.createScriptProcessor(1024 * 4, 1, 1);

        source.connect(processor);
        processor.connect(context.destination);

        processor.onaudioprocess = function(e) {
            console.log(e);
        };
    };

    navigator.mediaDevices.getUserMedia({ audio: true, video: false })
        .then(handleSuccess);

Now the problem is playing this e. I tried decoding it with context.decodeAudioData(e.inputBuffer.getChannelData(0).buffer), but it throws an error:

DOMException: Failed to execute 'decodeAudioData' on 'BaseAudioContext': Unable to decode audio data

Why is it not working? e.inputBuffer.getChannelData(0).buffer returns an object of type ArrayBuffer, which is exactly what decodeAudioData wants, and I also made sure that the array is not empty.

Please help me to solve this problem. Thank you


Solution

  • The audioprocess event of a ScriptProcessorNode gives you access to the current block of audio as an AudioBuffer. If you want to play that AudioBuffer, you can use it as is and don't need to decode it anymore: it is already decoded PCM data. decodeAudioData, on the other hand, expects an encoded file (WAV, MP3, ...), which is why feeding it raw channel data fails. The AudioBuffer can be played with an AudioBufferSourceNode, as shown in the first sketch below.

    The ScriptProcessorNode is officially deprecated, but I guess it will never go away. Its successor, the AudioWorkletProcessor, gives you access to the raw channel data; see the second sketch at the end.
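
    A minimal sketch of the AudioBufferSourceNode idea, assuming nothing beyond the code in the question: each block delivered by onaudioprocess is copied into a fresh AudioBuffer and scheduled right after the previous one. Playing the microphone straight back to the speakers will cause feedback, so try it with headphones.

        const handleSuccess = function(stream) {
            const context = new AudioContext();
            const source = context.createMediaStreamSource(stream);
            const processor = context.createScriptProcessor(1024 * 4, 1, 1);

            // Time at which the next block should start playing.
            let playTime = context.currentTime;

            source.connect(processor);
            // The processor's own output stays silent here; the connection
            // just keeps the node processing so onaudioprocess keeps firing.
            processor.connect(context.destination);

            processor.onaudioprocess = function(e) {
                // Copy the block, because the engine may reuse e.inputBuffer.
                const input = e.inputBuffer;
                const copy = context.createBuffer(input.numberOfChannels, input.length, input.sampleRate);

                for (let channel = 0; channel < input.numberOfChannels; channel++) {
                    copy.copyToChannel(input.getChannelData(channel), channel);
                }

                // No decodeAudioData needed: the AudioBuffer is already decoded.
                const bufferSource = context.createBufferSource();
                bufferSource.buffer = copy;
                bufferSource.connect(context.destination);

                // Schedule the blocks back to back for roughly continuous playback.
                playTime = Math.max(playTime, context.currentTime);
                bufferSource.start(playTime);
                playTime += copy.duration;
            };
        };

        navigator.mediaDevices.getUserMedia({ audio: true, video: false })
            .then(handleSuccess);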
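
    And here is a sketch of the AudioWorklet approach. The processor name 'recorder-processor' and the module file name recorder-processor.js are just assumptions made for this example: the browser calls process() with the raw channel data (currently 128 samples per block), and the worklet posts a copy of it to the main thread.

        // recorder-processor.js (file name assumed for this example).
        // This part runs on the audio rendering thread.
        class RecorderProcessor extends AudioWorkletProcessor {
            process(inputs, outputs) {
                const input = inputs[0]; // first input: an array of Float32Arrays, one per channel
                if (input.length > 0) {
                    // Post a copy of channel 0's raw samples to the main thread.
                    this.port.postMessage(input[0].slice(0));
                }
                return true; // keep the processor alive
            }
        }
        registerProcessor('recorder-processor', RecorderProcessor);

        // Main thread: load the module, route the microphone into the worklet
        // and receive the raw samples via the message port.
        async function start() {
            const context = new AudioContext();
            await context.audioWorklet.addModule('recorder-processor.js');

            const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
            const source = context.createMediaStreamSource(stream);
            const recorder = new AudioWorkletNode(context, 'recorder-processor');

            recorder.port.onmessage = function(event) {
                // event.data is a Float32Array of raw samples for one render quantum.
                console.log(event.data);
            };

            source.connect(recorder);
        }

        start();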