
Concerning Web Audio nodes, what does .connect() do?


Trying to follow the example here, which is basically a copy-and-paste of this one.
I think I've got most of the parts down, except for all the node.connect() calls.

From what I understand, this sequence of code is needed to provide the audio analyzer with an audio stream:

    var source = audioCtx.createMediaStreamSource(stream);
    source.connect(analyser);
    analyser.connect(audioCtx.destination);

I can't seem to make sense of it, as it looks rather ouroboros-y to me.
And unfortunately I can't seem to find any documentation on .connect(), so I'm quite lost and would appreciate any clarification!

Oh, and I'm loading an .mp3 via pure JavaScript (new Audio('db.mp3').play();) and am trying to use that as the source without creating an <audio> element in the markup.
Can a MediaStream object be created from this to feed into .createMediaStreamSource(stream)?


Solution

  • connect() simply defines where a node's output goes. In this case, your source reads the stream into its buffer and writes it to the input of the next node, which is whatever you pass to connect(). The same is then repeated for your analyser node.

    Think of it as pipes.
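
    For example, the three lines from the question build the chain source -> analyser -> destination: each connect() call simply attaches one node's output to the input of the node you pass in, and nothing loops back on itself. As a small sketch (assuming the same audioCtx and stream as in the question; in newer browsers connect() also returns the node you passed, so the calls can be chained):

        var source = audioCtx.createMediaStreamSource(stream);
        var analyser = audioCtx.createAnalyser();

        // Audio flows source -> analyser -> speakers, in one direction only,
        // so there is no ouroboros here.
        source.connect(analyser).connect(audioCtx.destination);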

    Here is a sample code snippet that I wrote a few years back using the Web Audio API.

        this.scriptProcessor = this.audioContext.createScriptProcessor(this.scriptProcessorBufferSize,
                                                                       this.scriptProcessorInputChannels,
                                                                       this.scriptProcessorOutputChannels);
        this.scriptProcessor.connect(this.audioContext.destination);
        this.scriptProcessor.onaudioprocess = updateMediaControl.bind(this);

        // Set up the gain node with a default value of 1 (max volume).
        this.gainNode = this.audioContext.createGain();
        this.gainNode.connect(this.audioContext.destination);
        this.gainNode.gain.value = 1;

        sewi.AudioResourceViewer.prototype.playAudio = function() {
            if (this.audioBuffer) {
                // A buffer source is single-use, so a fresh one is created for each playback.
                this.source = this.audioContext.createBufferSource();
                this.source.buffer = this.audioBuffer;
                this.source.connect(this.gainNode);
                this.source.connect(this.scriptProcessor);
                this.beginTime = Date.now();
                this.source.start(0, this.offset);
                this.isPlaying = true;
                this.controls.update({ playing: this.isPlaying });
                updateGraphPlaybackPosition.call(this, this.offset);
            }
        };
    

    So as you can see, my source is connected to both a gainNode and a scriptProcessor, each of which is in turn connected to the destination. When the audio starts playing, the data flows along source->gainNode->destination and source->scriptProcessor->destination, through the "pipes" that connect them, and those pipes are exactly what connect() defines. As the audio data passes through the gainNode, the volume can be adjusted by changing the amplitude of the audio wave. The scriptProcessor branch, meanwhile, lets events be attached and triggered while the audio is being processed.
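
    To make those two branches concrete, and to answer the second part of the question: an object created with new Audio(...) is an HTMLMediaElement, so no MediaStream is needed at all; createMediaElementSource() can feed it straight into the graph. Below is a minimal sketch (not the sewi code above; the peak-level computation is just an illustration) that routes such a source through both a gain branch and a script-processor branch, mirroring the wiring described here. Note that ScriptProcessorNode has since been deprecated in favour of AudioWorklet, and recent browsers may require a user gesture before the context starts:

        var audioCtx = new AudioContext();

        // new Audio() returns an HTMLMediaElement, so it can act as a graph
        // source without an <audio> tag ever being added to the page.
        var audioEl = new Audio('db.mp3');
        var source = audioCtx.createMediaElementSource(audioEl);

        // Gain branch: scales the amplitude of every sample (volume control).
        var gainNode = audioCtx.createGain();
        gainNode.gain.value = 0.5; // half volume
        source.connect(gainNode);
        gainNode.connect(audioCtx.destination);

        // Script-processor branch: exposes raw samples to JavaScript while
        // the audio is being processed.
        var scriptProcessor = audioCtx.createScriptProcessor(4096, 1, 1);
        source.connect(scriptProcessor);
        scriptProcessor.connect(audioCtx.destination);
        scriptProcessor.onaudioprocess = function(event) {
            var samples = event.inputBuffer.getChannelData(0);
            var peak = 0;
            for (var i = 0; i < samples.length; i++) {
                peak = Math.max(peak, Math.abs(samples[i]));
            }
            console.log('peak level of this block:', peak);
        };

        audioEl.play();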