Tags: javascript, safari, web-audio-api, audiocontext, webkitaudiocontext

Safari webkitAudioContext.createBuffer API raises NotSupportedError exception


I am using the JavaScript Web Audio API (AudioContext) to play audio. It works fine in other major browsers, but Safari on macOS raises a NotSupportedError exception when webkitAudioContext.createBuffer is called. I found that the question Play PCM with javascript also mentions this Safari issue at the end of the page. So, I debugged the "Working example https://o.lgm.cl/example.html (16-bit LSB)" from there and hit the same issue in Safari.

As I am still new to Stack Overflow, I cannot add comments to that question to ask how they solved this issue. So, can someone please help? Much appreciated!

Edit:

Running these two lines of code in Safari's JavaScript console reproduces the issue:

var audioCtx = new (window.AudioContext || window.webkitAudioContext)(); 
var myAudioBuffer = audioCtx.createBuffer(1, 48000, 16000); 

> NotSupportedError: The operation is not supported.

Solution

  • The error you are getting is, in a way, expected. The Web Audio spec says a NotSupportedError must be thrown if the sampleRate is outside the supported range, but it also requires implementations to support sample rates at least in the range from 8000 Hz to 96000 Hz, so 16000 Hz should work.

    https://webaudio.github.io/web-audio-api/#dom-baseaudiocontext-createbuffer

    Safari's Web Audio implementation only supports AudioBuffers with a sampleRate of 22050 Hz or more. I would therefore suggest creating an AudioBuffer at 32000 Hz, because 32000 is a multiple of 16000, which makes the next step a bit easier to reason about.

    When filling the buffer, you need to compensate for the larger sampleRate by interpolating the missing values yourself. A basic linear interpolation should work reasonably well. Alternatively, you can use an OfflineAudioContext to resample your AudioBuffer.
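    The manual interpolation can be sketched as follows; the helper name upsampleLinear is mine, not part of the Web Audio API:

```javascript
// Upsample a signal by an integer factor using linear interpolation.
// Hypothetical helper, not part of the Web Audio API.
function upsampleLinear(input, factor) {
  const output = new Float32Array(input.length * factor);
  for (let i = 0; i < input.length; i++) {
    // Hold the last sample at the end of the signal.
    const next = i + 1 < input.length ? input[i + 1] : input[i];
    for (let j = 0; j < factor; j++) {
      output[i * factor + j] = input[i] + (next - input[i]) * (j / factor);
    }
  }
  return output;
}

// Usage sketch (browser only): copy 16 kHz samples into a 32 kHz AudioBuffer.
// const ctx = new (window.AudioContext || window.webkitAudioContext)();
// const buffer = ctx.createBuffer(1, samples16k.length * 2, 32000);
// buffer.getChannelData(0).set(upsampleLinear(samples16k, 2));
```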

    In a perfect world (e.g. Firefox, Chrome or Opera) you can resample an AudioBuffer like this:

    // Let's assume you have an AudioBuffer called audioBuffer of 1 second at 16 kHz.
    const offlineAudioContext = new OfflineAudioContext(
        { length: 32000, sampleRate: 32000 }
    );
    const audioBufferSourceNode = new AudioBufferSourceNode(
        offlineAudioContext,
        { buffer: audioBuffer }
    );
    
    audioBufferSourceNode.start(0);
    audioBufferSourceNode.connect(offlineAudioContext.destination);
    
    const resampledAudioBuffer = await offlineAudioContext.startRendering();
    

    The variable resampledAudioBuffer will now reference a resampled AudioBuffer at 32 kHz.

    But the implementation of the Web Audio API in Safari is outdated and buggy. Not only does it not support creating AudioBuffers with a sampleRate below 22050 Hz, it also can't create an OfflineAudioContext with a sampleRate below 44100 Hz.

    However, all you need for your use case is to resample your data by a factor of two, and resampling from 16 kHz to 32 kHz is mathematically the same as resampling from 44100 Hz to 88200 Hz.

    You can therefore create an AudioBuffer at 44100 Hz and fill it with your data, which is actually at 16 kHz. Then you resample that buffer to 88200 Hz. The resulting data will be your original data at 32 kHz.
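    The workaround can be sketched like this; the helper name resampleForSafari is mine, and the code assumes Safari's legacy webkitOfflineAudioContext with the event-based startRendering():

```javascript
// Sketch of the factor-of-two workaround for Safari's legacy Web Audio API.
// samples16k is a Float32Array of PCM data that is really at 16 kHz.
function resampleForSafari(samples16k) {
  return new Promise((resolve) => {
    // Safari needs the positional constructor and a sampleRate >= 44100 Hz.
    const offlineCtx = new webkitOfflineAudioContext(
      1,                     // numberOfChannels
      samples16k.length * 2, // output length, twice the input
      88200                  // 88200 / 44100 === 32000 / 16000
    );

    // The buffer claims to be at 44100 Hz but actually holds 16 kHz data.
    const buffer = offlineCtx.createBuffer(1, samples16k.length, 44100);
    buffer.getChannelData(0).set(samples16k);

    const source = offlineCtx.createBufferSource();
    source.buffer = buffer;
    source.connect(offlineCtx.destination);
    source.start(0);

    // Old Safari only implements the event-based startRendering().
    offlineCtx.oncomplete = (event) => resolve(event.renderedBuffer);
    offlineCtx.startRendering();
  });
}
```

    The returned AudioBuffer nominally has a sampleRate of 88200 Hz, but its samples are the original data resampled by a factor of two, so you can treat it as 32 kHz data.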

    That's all very complicated but unfortunately I don't know of any other way to do it in Safari.

    To avoid having to use the outdated syntax that is still necessary for Safari, I would recommend using a polyfill. I'm the author of standardized-audio-context, which is why I recommend that one, but it is not the only option.