Tags: javascript, web-audio-api

How to get data in AudioBuffer based on time in seconds?


I know how to use decodeAudioData from the AudioContext.
What I do not know is how to process the AudioBuffer returned by decodeAudioData.

Let's assume we have the following returned AudioBuffer:

{
    length: 12012146, 
    duration: 250.25304166666666, 
    sampleRate: 48000, 
    numberOfChannels: 2
}

Now I can call getChannelData() on the AudioBuffer, which gives me a Float32Array with a length of AudioBuffer.length.
I know the duration of the AudioBuffer is AudioBuffer.length / AudioBuffer.sampleRate.
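So, as far as I understand, going from a time in seconds to an index into the channel data should just be a multiplication by the sample rate, something like this (using the audioBuffer from above):

    const seconds = 2.5;
    // With sampleRate = 48000 this gives index 120000.
    const sampleIndex = Math.round(seconds * audioBuffer.sampleRate);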

What I would like is a function that takes a time value (in seconds) and returns the data from AudioBuffer.getChannelData() based on that time.
So I would call getChannelDataFromTime(seconds) and it would return an array of (to me) unknown size.

Am I missing something that would make my work easier?


Solution

  • The method you are looking for already exists, or at least a very similar one does. It's called copyFromChannel(), and it copies a portion of the channel data into a given Float32Array.

    Let's say you want to copy the channel data of the first second. In that case you first need to create a Float32Array which can hold the samples for one second.

    const seconds = 1;
    const channelData = new Float32Array(Math.round(audioBuffer.sampleRate * seconds));
    

    Next you can copy the data into that Float32Array.

    audioBuffer.copyFromChannel(channelData, 0, 0);
    

    The first numeric parameter defines the channel to copy the data from, and the second numeric parameter defines the offset (in samples) at which copying starts.
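
    Putting the pieces together, a getChannelDataFromTime(seconds) helper like the one asked for could be sketched roughly as follows. The function name, the default length of one second and the clamping to the end of the buffer are just assumptions for illustration:

    // Rough sketch of the helper asked for in the question. It copies
    // `durationInSeconds` seconds of the first channel, starting at `seconds`,
    // and clamps the range to the end of the buffer.
    function getChannelDataFromTime(audioBuffer, seconds, durationInSeconds = 1) {
        const startSample = Math.round(seconds * audioBuffer.sampleRate);
        const sampleCount = Math.max(
            0,
            Math.min(
                Math.round(durationInSeconds * audioBuffer.sampleRate),
                audioBuffer.length - startSample
            )
        );
        const channelData = new Float32Array(sampleCount);

        // copyFromChannel(destination, channelNumber, startInChannel)
        audioBuffer.copyFromChannel(channelData, 0, startSample);

        return channelData;
    }

    // Example: the samples between second 10 and second 11 of the first channel.
    const dataAtTenSeconds = getChannelDataFromTime(audioBuffer, 10);

    Other channels can be read the same way by changing the second argument of copyFromChannel().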