Byte-beats are a fun way of making lo-fi music. I want to make some music myself using the WebAudio API. Here's my current code:
const sampleRate = 8000;
const frameCount = sampleRate * 5; // 5 seconds of audio
const audioCtx = new AudioContext({ sampleRate });
const src = audioCtx.createBufferSource();
const buf = audioCtx.createBuffer(1, frameCount, sampleRate);
// Fill the whole buffer up front. Web Audio expects float samples
// in the range [-1, 1], so the formula is scaled accordingly.
buf.getChannelData(0).set(buf.getChannelData(0).map((_, t) => {
    return Math.sin(t / 10 + Math.sin(t * Math.pow(2, t >> 10))) * 0.5;
}));
src.buffer = buf;
src.connect(audioCtx.destination);
src.start();
console.log('Reached the end :/');
My issue with this solution is that I have to create a huge buffer which has to be kept in memory. I was hoping there would be a way to compute the sound's amplitude dynamically to save memory.
The byte-beats will be entire music compositions and can be pretty long, so the frame counts can become pretty huge.
Can anyone suggest how to do this? Using other libraries is an option, but I would prefer to avoid that.
That sounds like a good use case for an AudioWorklet. With an AudioWorklet you only have to provide 128 samples (one render quantum) at a time. It runs on a separate thread for performance reasons, which makes it a bit more complicated to code. Here is a basic example which uses a dynamically created object URL to load the code for the AudioWorklet.
const play = async () => {
    const audioContext = new AudioContext({ sampleRate: 8000 });
    const source = `registerProcessor(
        'byte-beats-processor',
        class extends AudioWorkletProcessor {
            process (_, [ output ]) {
                // Fill the first channel of the first output,
                // one render quantum (usually 128 samples) at a time.
                for (let i = 0; i < output[0].length; i += 1) {
                    const t = currentFrame + i;
                    output[0][i] = Math.sin(t);
                }
                // Returning true keeps the processor alive.
                return true;
            }
        }
    );`;
    const blob = new Blob([ source ], { type: 'application/javascript' });
    const url = URL.createObjectURL(blob);
    await audioContext.audioWorklet.addModule(url);
    URL.revokeObjectURL(url);
    const audioWorkletNode = new AudioWorkletNode(audioContext, 'byte-beats-processor');
    audioWorkletNode.connect(audioContext.destination);
};
play();
Of course using Math.sin(t) is only an example. You probably want to replace it with something more interesting.
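For example, you could drop a classic byte-beat style formula into the processor. Byte-beat formulas conventionally produce unsigned byte values (0–255), which need to be mapped into the [-1, 1] range Web Audio expects. A minimal sketch, using a hypothetical formula of my own choosing:

```javascript
// Hypothetical byte-beat formula; any integer expression in t works.
// The & 0xFF wraps the result to an unsigned byte (0..255), and the
// division/subtraction maps that byte range onto [-1, 1) for Web Audio.
const byteBeat = (t) => (((t * (t >> 10)) & 0xFF) / 128) - 1;

// Inside the processor's loop you would then write:
//     output[0][i] = byteBeat(currentFrame + i);
```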
At the time of writing, the AudioWorklet is only available in Chrome. That means you still need to fall back to the deprecated ScriptProcessorNode for other browsers, or you can use a polyfill such as standardized-audio-context which allows you to use the same code in all browsers.
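For completeness, here is a rough sketch of what that ScriptProcessorNode fallback could look like. The computeSample function is just a stand-in for your byte-beat formula, and the buffer size of 256 is an arbitrary choice:

```javascript
// Stand-in for the actual byte-beat formula.
const computeSample = (t) => Math.sin(t);

const playFallback = () => {
    const audioContext = new AudioContext({ sampleRate: 8000 });
    // createScriptProcessor(bufferSize, inputChannels, outputChannels).
    // The onaudioprocess callback runs on the main thread, which is
    // one of the reasons this node is deprecated.
    const node = audioContext.createScriptProcessor(256, 1, 1);
    let frame = 0;
    node.onaudioprocess = (event) => {
        const output = event.outputBuffer.getChannelData(0);
        for (let i = 0; i < output.length; i += 1) {
            output[i] = computeSample(frame + i);
        }
        frame += output.length;
    };
    node.connect(audioContext.destination);
};
```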