Tags: audio, html5-audio, web-audio-api

Web audio API: scheduling sounds and exporting the mix


I've been reading the Web Audio API documentation and tutorials, but I haven't quite figured out how to approach this problem.

Let's say I load a few WAV files via XMLHttpRequest and then create buffer sources. I know I can schedule precisely when playback starts. But what if I don't want to play them out loud, and instead want to schedule them into a buffer and store the result?
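
Roughly what I mean by loading and creating buffer sources (a simplified sketch; the URL and callback names are just placeholders):

    var context = new webkitAudioContext();

    // Fetch a wav file and decode it into an AudioBuffer
    function loadSample(url, onDecoded) {
      var request = new XMLHttpRequest();
      request.open('GET', url, true);
      request.responseType = 'arraybuffer';
      request.onload = function () {
        context.decodeAudioData(request.response, onDecoded);
      };
      request.send();
    }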

A real example: I want to create a simple sequencer where you schedule drums and then export the whole mix to WAV (without recording it using RecorderJS or something). Any ideas, libraries?


Solution

  • Just did something a bit like this.

    Essentially, you'll need to create an offline context:

    var offline = new webkitOfflineAudioContext(numChannels, lengthInSamples, sampleRate)
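
    For example, for a 30-second stereo mix at 44.1 kHz (the duration and channel count here are just assumed values):

    var sampleRate = 44100;
    var lengthInSamples = 30 * sampleRate;   // 30 seconds of output (assumed)
    var offline = new webkitOfflineAudioContext(2, lengthInSamples, sampleRate);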
    

    You'll have to recreate all your BufferSources using this new context:

    var newBufferSource = offline.createBufferSource();
    newBufferSource.buffer = someAudioBuffer;
    newBufferSource.connect(offline.destination);
    

    Then schedule your playback:

    newBufferSource.start(offline.currentTime + 10);
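
    In a sequencer you'd typically do this in a loop, creating one source per scheduled hit. A rough sketch (the pattern data and tempo are made up for illustration):

    // Hypothetical pattern: each hit has a decoded AudioBuffer and a 16th-note step index
    var secondsPerStep = 60 / 120 / 4;   // 120 BPM, 16th-note grid (assumed)
    pattern.forEach(function (hit) {
      var src = offline.createBufferSource();
      src.buffer = hit.buffer;
      src.connect(offline.destination);
      src.start(hit.step * secondsPerStep);
    });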
    

    Then bind to the complete event for your offline rendering:

    offline.oncomplete = function( ev ){
      doSomething(ev.renderedBuffer);
    };
    

    Then start 'rendering':

    offline.startRendering();
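
    In the standardized, unprefixed OfflineAudioContext, startRendering() also returns a promise that resolves with the rendered buffer, so where that's available you can skip the oncomplete handler:

    offline.startRendering().then(function (renderedBuffer) {
      doSomething(renderedBuffer);
    });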
    

    Once you have ev.renderedBuffer, you can do whatever you want with it. In my app, I have a WAV encoder that I ended up writing myself - but you could modify Recorder.js to do the same thing pretty easily.
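
    If you want to roll your own encoder, the core idea is to interleave the channel data, convert it to 16-bit PCM, and prepend a 44-byte RIFF/WAVE header. A rough sketch (not my exact code, and it assumes 16-bit output):

    function audioBufferToWav(buffer) {
      var numChannels = buffer.numberOfChannels;
      var sampleRate = buffer.sampleRate;
      var numSamples = buffer.length;
      var blockAlign = numChannels * 2;                  // 2 bytes per 16-bit sample
      var dataSize = numSamples * blockAlign;
      var view = new DataView(new ArrayBuffer(44 + dataSize));

      function writeString(offset, str) {
        for (var i = 0; i < str.length; i++) view.setUint8(offset + i, str.charCodeAt(i));
      }

      // RIFF/WAVE header
      writeString(0, 'RIFF');
      view.setUint32(4, 36 + dataSize, true);
      writeString(8, 'WAVE');
      writeString(12, 'fmt ');
      view.setUint32(16, 16, true);                      // fmt chunk size
      view.setUint16(20, 1, true);                       // PCM
      view.setUint16(22, numChannels, true);
      view.setUint32(24, sampleRate, true);
      view.setUint32(28, sampleRate * blockAlign, true); // byte rate
      view.setUint16(32, blockAlign, true);
      view.setUint16(34, 16, true);                      // bits per sample
      writeString(36, 'data');
      view.setUint32(40, dataSize, true);

      // Interleave channels and write clamped 16-bit samples
      var channels = [];
      for (var ch = 0; ch < numChannels; ch++) channels.push(buffer.getChannelData(ch));
      var offset = 44;
      for (var i = 0; i < numSamples; i++) {
        for (var ch = 0; ch < numChannels; ch++) {
          var s = Math.max(-1, Math.min(1, channels[ch][i]));
          view.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
          offset += 2;
        }
      }

      return new Blob([view], { type: 'audio/wav' });
    }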

    Just a heads-up: webkitOfflineAudioContext is Chrome-only at the moment. Here's a link if you're interested: OfflineAudioContext