Because of a bug in Safari 15 that sometimes causes AudioContext.decodeAudioData to fail for normal MP3 files (see Safari 15 fails to decode audio data that previous versions decoded without problems), I'm trying to do a workaround. The workaround is to decode the files with the library https://github.com/soundbus-technologies/js-mp3, then create an AudioBuffer from that data and play that.
The problem is that js-mp3 returns one ArrayBuffer with PCM data, while creating an AudioBuffer requires two separate arrays, one for each channel, plus the sampleRate and the sample frame length. What I've got so far is:
function concatTypedArrays(a, b) { // a, b TypedArray of same type
    var c = new (a.constructor)(a.length + b.length);
    c.set(a, 0);
    c.set(b, a.length);
    return c;
}
// responseData is an ArrayBuffer with the MP3 file...
let decoder = Mp3.newDecoder(responseData);
let pcmArrayBuffer = decoder.decode();
// Trying to read the frames to get the two channels. Maybe get it correctly from
// the pcmArrayBuffer instead?
decoder.source.pos = 0;
let left = new Float32Array(), right = new Float32Array();
console.log('Frame count: ' + decoder.frameStarts.length);
let result;
let i = 0;
let samplesDecoded = 0;
while (true) {
    result = decoder.readFrame();
    if (result.err) {
        break;
    } else {
        console.log('READ FRAME ' + (++i));
        samplesDecoded += 1152; // Think this is the right sample count per frame for MPEG1 files
        left = concatTypedArrays(left, decoder.frame.v_vec[0]);
        right = concatTypedArrays(right, decoder.frame.v_vec[1]);
    }
}
let audioContext = new AudioContext();
let buffer = audioContext.createBuffer(2, samplesDecoded, decoder.sampleRate);
// Fill the buffer with the decoded samples, one Float32Array per channel.
buffer.copyToChannel(left, 0);
buffer.copyToChannel(right, 1);
let source = audioContext.createBufferSource();
source.buffer = buffer;
source.connect(audioContext.destination);
source.start(0);
Now, this sort of works: I do hear sounds, and I can tell they are the right sounds, but they are weirdly distorted. An example sound file I'm trying to play is https://cardgames.io/mahjong/sounds/selecttile.mp3
Any ideas what is wrong here? Or how do I correctly convert the single PCM ArrayBuffer returned by .decode() into the format needed to play it properly?
The example that fdcpp linked above shows that the ArrayBuffer returned by decoder.decode() can be written to a WAV file without any further modification. This means the data must be interleaved 16-bit PCM data.
It should therefore work to convert the samples back to floating-point values and to put them into planar arrays, as expected by the Web Audio API.
const interleavedPcmData = new DataView(pcmArrayBuffer);
const numberOfChannels = decoder.frame.header.numberOfChannels();

const audioBuffer = new AudioBuffer({
    length: pcmArrayBuffer.byteLength / 2 / numberOfChannels,
    numberOfChannels,
    sampleRate: decoder.sampleRate
});

const planarChannelDatas = [];

for (let i = 0; i < numberOfChannels; i += 1) {
    planarChannelDatas.push(audioBuffer.getChannelData(i));
}

for (let i = 0; i < interleavedPcmData.byteLength; i += 2) {
    // The interleaved samples alternate between the channels.
    const channelNumber = i / 2 % numberOfChannels;
    // Each sample is a little-endian signed 16-bit integer.
    const value = interleavedPcmData.getInt16(i, true);

    // Scale the integer to a float in the range [-1, 1].
    planarChannelDatas[channelNumber][Math.floor(i / 2 / numberOfChannels)]
        = value < 0
            ? value / 32768
            : value / 32767;
}
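To hear the result, the converted audioBuffer can then be played back just like in the question, with an AudioBufferSourceNode (a minimal sketch, assuming the audioContext from the question's snippet is available):

const source = audioContext.createBufferSource();

source.buffer = audioBuffer;
source.connect(audioContext.destination);
source.start(0);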