Tags: javascript, web-audio-api

Where does the property "stream" come from in regard to createMediaStreamSource?


W3.org has the following example: https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/webrtc-integration.html

<canvas id="c"></canvas>
<script>
    navigator.getUserMedia('audio', gotAudio);
    var streamRecorder;
    function gotAudio(stream) {
        var microphone = context.createMediaStreamSource(stream);
        var analyser = context.createAnalyser();
        microphone.connect(analyser);
        analyser.connect(context.destination);
        requestAnimationFrame(drawAnimation);

        streamRecorder = stream.record();
        peerConnection.addStream(stream);
    }
</script>

What is "stream"? Where does this value come from? It is simply used here as though it already exists, and I don't understand how it gets defined.


Solution

  • navigator.getUserMedia('audio', gotAudio); getUserMedia prompts the user for permission to use a media input device, such as a camera, a screen-share source, or a microphone.

    In your case you are requesting audio permission. If the getUserMedia call succeeds, the browser invokes the success callback you named, here gotAudio, and passes it the resulting MediaStream as its argument. That is where stream comes from: you never define it yourself; the browser constructs it and hands it to your callback.

    Once access to the audio device has been granted, you can use the microphone or any other audio input. In the gotAudio function, the code feeds the audio stream into the Web Audio API through a microphone source node, records it, and adds it to a peer connection.

    You can also supply a failure callback: navigator.getUserMedia(constraints, successCallback, errorCallback); (sketches of both styles follow below).
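
For reference, here is a minimal, self-contained sketch of the same flow using the current promise-based API, navigator.mediaDevices.getUserMedia. Note that the string form getUserMedia('audio', ...) in the spec draft is obsolete; browsers expect a constraints object such as { audio: true }. The AudioContext setup is an assumption added here, since the original snippet uses a context variable it never defines.

var context = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true })  // constraints object, not the string 'audio'
    .then(function gotAudio(stream) {
        // "stream" is a MediaStream the browser constructs for you.
        // In the legacy callback API it arrived as the success callback's
        // first argument; here it is the promise's resolved value.
        var microphone = context.createMediaStreamSource(stream);
        var analyser = context.createAnalyser();
        microphone.connect(analyser);
        // Only connect to the speakers if you actually want to hear the
        // microphone (this can cause feedback):
        // analyser.connect(context.destination);
    })
    .catch(function onError(err) {
        // Plays the role of the legacy errorCallback in
        // navigator.getUserMedia(constraints, successCallback, errorCallback).
        console.error('getUserMedia failed:', err.name, err.message);
    });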
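
And here is the legacy three-argument callback form that the last point refers to, kept close to the spec example. The key point for the original question: the success callback's parameter name is arbitrary (the spec example happens to call it stream), and the browser fills it in with the MediaStream when permission is granted.

// Deprecated callback form; shown only to mirror the spec example.
// Older browsers exposed it under a vendor prefix (e.g. webkitGetUserMedia).
navigator.getUserMedia(
    { audio: true },                    // constraints
    function successCallback(stream) {  // the browser passes the MediaStream in here
        // use stream exactly as in gotAudio above
    },
    function errorCallback(err) {       // called if access fails or is denied
        console.error(err);
    }
);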