I am making an application that reads and plays two audio files.
CodeSandbox
The CodeSandbox above has the following behavior.
When playing audio, there is sometimes a delay between the two tracks. However, the delay is not always present; sometimes the two tracks play back at exactly the same time.
Although it is not implemented in the CodeSandbox above, the application I am currently working on has a seek bar that indicates the current playback position. Moving the seek bar reloads the audio, which can cure an existing delay. On the other hand, moving the seek bar can also introduce a delay even when the two tracks were previously playing in perfect sync.
Anyway, is there a way to play multiple audio tracks at the same time in a stable and consistent manner?
import { useEffect, useState } from "react";

let ctx,
  tr1,
  tr2,
  tr1gain,
  tr2gain,
  started = false;

const trackList = ["track1", "track2"];

const App = () => {
  useEffect(() => {
    ctx = new AudioContext();
    tr1 = ctx.createBufferSource();
    tr2 = ctx.createBufferSource();
    tr1gain = ctx.createGain();
    tr2gain = ctx.createGain();
    // Fetch, decode and assign both tracks.
    trackList.forEach(async (item) => {
      const res = await fetch("/" + item + ".mp3");
      const arrayBuffer = await res.arrayBuffer();
      const audioBuffer = await ctx.decodeAudioData(arrayBuffer);
      if (item === "track1") {
        tr1.buffer = audioBuffer;
      } else {
        tr2.buffer = audioBuffer;
      }
    });
    tr1.connect(tr1gain);
    tr1gain.connect(ctx.destination);
    tr2.connect(tr2gain);
    tr2gain.connect(ctx.destination);
    return () => ctx.close();
  }, []);

  const [playing, setPlaying] = useState(false);

  const playAudio = () => {
    // A buffer source can only be started once; afterwards just resume.
    if (!started) {
      tr1.start();
      tr2.start();
      started = true;
    }
    ctx.resume();
    setPlaying(true);
  };

  const pauseAudio = () => {
    ctx.suspend();
    setPlaying(false);
  };

  const changeVolume = (e) => {
    const target = e.target.ariaLabel;
    if (target === "track1") {
      tr1gain.gain.value = e.target.value;
    } else {
      tr2gain.gain.value = e.target.value;
    }
  };

  const Inputs = trackList.map((item, index) => (
    <div key={index}>
      <span>{item}</span>
      <input
        type="range"
        onChange={changeVolume}
        min="0"
        max="1"
        step="any"
        aria-label={item}
      />
    </div>
  ));

  return (
    <>
      <button
        onClick={playing ? pauseAudio : playAudio}
        style={{ display: "block" }}
      >
        {playing ? "pause" : "play"}
      </button>
      {Inputs}
    </>
  );
};
When calling start() without a parameter, it's the same as calling start() with the currentTime of the AudioContext as the first parameter. In your example that would look like this:
tr1.start(tr1.context.currentTime);
tr2.start(tr2.context.currentTime);
By definition the currentTime of an AudioContext increases over time. It's entirely possible that it advances between the two calls. Therefore a first attempt to fix the problem could be to make sure both function calls use the same value.
const currentTime = tr1.context.currentTime;
tr1.start(currentTime);
tr2.start(currentTime);
Since currentTime usually advances by one render quantum (128 samples) at a time, you could add an extra safety net by scheduling playback slightly in the future.
const currentTime = tr1.context.currentTime + 128 / tr1.context.sampleRate;
tr1.start(currentTime);
tr2.start(currentTime);
If this doesn't help, you could also use an OfflineAudioContext to render your mix upfront into a single AudioBuffer.
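A minimal sketch of that approach, assuming both tracks are already decoded into AudioBuffers with the same sample rate (the helper name renderMix is made up here):

async function renderMix(bufferA, bufferB) {
  // Render long enough to hold the longer of the two tracks.
  const length = Math.max(bufferA.length, bufferB.length);
  const offline = new OfflineAudioContext(
    2, // stereo output
    length, // frames to render
    bufferA.sampleRate // assumes both buffers share one sample rate
  );
  for (const buffer of [bufferA, bufferB]) {
    const source = offline.createBufferSource();
    source.buffer = buffer;
    source.connect(offline.destination);
    source.start(0); // both tracks start at exactly frame 0
  }
  // Resolves with a single AudioBuffer containing the mixed audio.
  return offline.startRendering();
}

The mixed buffer can then be played through a single AudioBufferSourceNode, so the two tracks can no longer drift apart, and seeking only has to reposition one source. The trade-off is that per-track volume can no longer be changed during playback without re-rendering.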