Using the Web Audio API, I create a bufferSource and use a new MediaRecorder to record at the same time. I'm recording the sound coming out of my speakers with the built-in microphone.
If I play back the original and the new recording, there is a significant lag between the two. (It sounds like about 200ms to me.) If I console.log the value of globalAudioCtx.currentTime at the point of calling the two "start" methods, the two numbers are exactly the same. The values of Date.now() are also exactly the same.
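Simplified, the setup looks something like this (a sketch; `someDecodedAudioBuffer` stands in for the buffer I actually load, and error handling is omitted):

```js
const globalAudioCtx = new AudioContext();

// Microphone input for the MediaRecorder.
const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(micStream);

// The sample to play through the speakers.
const source = globalAudioCtx.createBufferSource();
source.buffer = someDecodedAudioBuffer; // decoded elsewhere
source.connect(globalAudioCtx.destination);

// Both starts happen back to back, and these log identical values.
console.log(globalAudioCtx.currentTime, Date.now());
source.start();
recorder.start();
console.log(globalAudioCtx.currentTime, Date.now());
```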
Where is this latency introduced? The latency due to the speed of sound is roughly a thousand times smaller than what I'm hearing.
In short, how can I get these two samples to play back at exactly the same time?
I'm working in Chrome on Linux.
Where is this latency introduced?
It's introduced on both the playback side and the recording side.
Your sound card has a buffer, and software writes audio to that buffer a small chunk at a time. If the software can't keep up, you hear dropouts and glitches, so buffer sizes are set large enough to prevent that from happening.
The same is true on the recording end: if the capture buffer is too small and the software can't read from it fast enough, recorded data is dropped, again producing choppy or incomplete audio.
Browsers don't use the lowest-latency mode of operation with your sound card. There are some tweaks you can apply (such as using WASAPI in exclusive mode on Windows with Chrome), but you're at the mercy of the browser developers, who didn't design this with folks like you and me in mind.
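From JavaScript itself, about the only knob you have is the latency hint when creating the context, and you can at least inspect what the browser thinks its latency is (a sketch; note that `outputLatency` isn't reported by every browser):

```js
// Ask for the lowest-latency configuration; the browser may ignore this.
const ctx = new AudioContext({ latencyHint: 'interactive' });

// baseLatency: processing delay of the context's own graph.
// outputLatency: estimated delay between the context and the hardware
// (not implemented in all browsers, so treat it as best-effort).
console.log('base latency (s):', ctx.baseLatency);
console.log('output latency (s):', ctx.outputLatency ?? 'not reported');
```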
No matter how small you make the buffers, though, there will still be some lag. That's the nature of computer-based digital audio.
how can I get these two samples to play back at exactly the same time?
You'll have to measure the offset and delay one of the samples to bring them back in sync.
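For example, once you determine the offset for your system (by recording a sharp click and locating it in the recording, or just tuning by ear), you can use the `when` argument of `AudioBufferSourceNode.start()` to hold back the original by that amount. A sketch, where `originalSource` and `recordedSource` are two buffer sources you've already created, and `measuredOffsetSeconds` is whatever you measured:

```js
const measuredOffsetSeconds = 0.2; // assumption: ~200 ms, tune for your setup

const now = globalAudioCtx.currentTime;

// Start the recording right away; its audio content already lags by the
// round-trip latency, so it effectively plays "late".
recordedSource.connect(globalAudioCtx.destination);
recordedSource.start(now);

// Hold the original back by the measured offset so the two line up.
originalSource.connect(globalAudioCtx.destination);
originalSource.start(now + measuredOffsetSeconds);
```

Alternatively, pass the offset as the second argument to `recordedSource.start()` to skip the recording's silent lead-in instead of delaying the original.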