I have a web app that plays audio samples successfully on all platforms except iOS.
Here's my minimal reproducible example, with additional context below.
```javascript
const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioContext = new AudioContext();

async function loadAudioSample(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
  return audioBuffer;
}

function playSample(audioBuffer) {
  const source = audioContext.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioContext.destination);
  source.start(0);
}

// make sure the user interacts with the document first
document.addEventListener('click', async () => {
  const buf = await loadAudioSample('./sample.mp3');
  if (audioContext.state === 'suspended') { await audioContext.resume(); }
  // logging comes up normal
  console.log(audioContext.state, buf);
  playSample(buf);
});
```
Additional notes:

- If I play `sample.mp3` from an audio element (via `new Audio()` and `.play()`), it does work, so I don't think the file format is the issue. (But I need the Web Audio API for higher precision and synchronizability.)
- I also tried a `.wav` file, encoded at 44.1kHz and at 48kHz.
- `audioContext.state` logs "running" right away on Mac, but "suspended" for the first couple of presses on iOS -- I'm having trouble pinning it down. Even when it logs "running" on iOS, there is no sound.

What am I doing wrong?
I can't believe I'm writing this, but the mute/silent switch was on. I checked the volume a million times, but it's apparently separate from the switch.
On the other hand, there is something I still don't understand: other websites like YouTube played audio during my testing. How? Maybe because those use HTML5 media elements (`new Audio()` with `.play()`) instead of the Web Audio API?
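If that explanation is right, a workaround I've seen suggested (a sketch under that assumption, not something I've verified across iOS versions) is to start a short silent HTML5 media element from the same user gesture, which reportedly moves the page into iOS's "playback" audio session that ignores the ring/silent switch. Here `./silence.mp3` is a hypothetical short silent file you'd ship yourself:

```javascript
// Returns the available AudioContext constructor, preferring the standard
// name and falling back to the WebKit-prefixed one; null if neither exists.
function pickAudioContextCtor(w) {
  return w.AudioContext || w.webkitAudioContext || null;
}

// Browser-only portion, guarded so the helper above stays environment-neutral.
if (typeof window !== 'undefined') {
  const Ctor = pickAudioContextCtor(window);
  const audioContext = new Ctor();

  document.addEventListener('click', () => {
    // Hypothetical silent file; playing a media element from a gesture is
    // the part that (reportedly) sidesteps the silent switch for Web Audio.
    const unlock = new Audio('./silence.mp3');
    unlock.play().catch(() => {});      // ignore autoplay rejections
    if (audioContext.state === 'suspended') {
      audioContext.resume();
    }
  }, { once: true });
}
```

I haven't needed this myself since flipping the switch fixed it, but it may explain why media-element sites keep working while Web Audio stays silent.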