Tags: javascript, web-audio-api

How to play a different sound in each ear in JS?


I need to send different sounds to the left and right ears, but I haven't managed to do it. I followed these steps:

  1. loaded my sounds (sentences and one pure tone called 'bip'),
  2. used getChannelData() to work on the raw data (Float32Array): applied a gain to one of the three sentences and summed them into the variable 'source', so the three sentences play simultaneously,
  3. used createBuffer(1, source.length, fs) to turn 'source' from a Float32Array into an AudioBuffer,
  4. finally used createBufferSource(), which takes the buffer so it can be played,
  5. this worked well, but now I need to play the sentences in 'source' in the left ear and the tone in 'bip' in the right ear.

So I made the modifications you can see below to put the 'source' variable and the 'bip' variable on different channels. The problem is that when I listen, the sentences are not completely on the left and the bip is not completely on the right: the two are not fully mixed, but not fully separated either. I don't understand why, or how to fix it. Does JS mix the channels before playing? How can I really have one sound on the left and the other on the right? I should add that left and right need to be simultaneous; this is for an auditory test. I tried StereoPannerNode with pan, but it didn't work well, for two reasons: (1) I can raise the level on the left or right to balance the sound, but I can't put all of it in one ear, and (2) it seems I can't use it to put one sound on the left and the other on the right, because it acts on the final mix.
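Step 2 above can be sketched as plain array math, independently of the Web Audio API. This is a minimal sketch; the function name, gain value, and sample data are made up for illustration:

```javascript
// Sum three sentences (Float32Array samples) into one mono "source" array,
// applying a gain to the third one, as described in step 2.
function mixWithGain(s1, s2, s3, gain) {
  const len = Math.max(s1.length, s2.length, s3.length);
  const out = new Float32Array(len);
  for (let i = 0; i < len; i++) {
    // Indexing past the end of a Float32Array yields undefined, hence "|| 0".
    out[i] = (s1[i] || 0) + (s2[i] || 0) + gain * (s3[i] || 0);
  }
  return out;
}

// Example with tiny made-up sample arrays:
const mixed = mixWithGain(
  Float32Array.from([0.1, 0.2]),
  Float32Array.from([0.3, 0.0]),
  Float32Array.from([0.2, 0.4]),
  0.5
);
// mixed is [0.5, 0.4] up to float rounding
```

In the real test the inputs would come from decoded audio (e.g. decodeAudioData), and the resulting array would be copied into an AudioBuffer as in step 3.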
    // put the concatenated sentences and the bip tone on separate stereo channels
    let buffer = context.createBuffer(2, source.length, fs);
    let bufferData = {
      l: buffer.getChannelData(0), // left channel
      r: buffer.getChannelData(1)  // right channel
    };
    bufferData.l.set(source); // sentences go to the left ear
    bufferData.r.set(bip);    // bip goes to the right ear

    // create a source node that will be used to play the buffer
    trial.sound = context.createBufferSource();
    trial.sound.buffer = buffer;
    trial.sound.connect(context.destination);
    trial.sound.start(); // remember to start playback
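One way to convince yourself that the buffer itself is fully separated is to check the two channel arrays directly: if they never overlap, any crosstalk you hear must be introduced downstream of the buffer (e.g. by system audio effects). A minimal sketch with made-up sample data, modelling the two channels as plain Float32Arrays so it runs outside the browser:

```javascript
// Model the two channels of the stereo buffer as plain arrays.
const source = Float32Array.from([0.1, 0.2, 0.3]); // sentences
const bip = Float32Array.from([0.9, 0.0, 0.9]);    // pure tone

const left = new Float32Array(source.length);
const right = new Float32Array(source.length);
left.set(source);  // like bufferData.l.set(source)
right.set(bip);    // like bufferData.r.set(bip)

// Every sample of "source" lands only on the left, "bip" only on the right:
const separated = left.every((v, i) => v === source[i]) &&
                  right.every((v, i) => v === bip[i]);
console.log(separated); // true
```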

EDIT: Raymond Toy's comment helped me find the solution. When I tried his little test code it didn't sound right either, even though JS reported that the output is stereo (context.destination.channelCount = 2). That reminded me of something that had caused me a lot of problems before: the preinstalled Dolby software (as you can see below, it is now disabled). I thought it was basically a filter, but it is also a mixer, which I only realised when I tried disabling it. Enabled: sound in both ears; disabled: one sound in each ear... So my problem is only partly solved, because the test has to work for anybody; for now the only solution I have is to make a video explaining to participants how to disable it before taking the test.

The Dolby software, now disabled

Note that, as another solution, I can also leave Dolby enabled and instead tick the box you can see in the next image, "désactiver les effets sonores", i.e. "disable sound effects":


Solution

  • This should have worked. As a simple test of your setup, try the following (untested):

    // context is the AudioContext.
    let s1 = new OscillatorNode(context, {frequency: 440});
    let s2 = new OscillatorNode(context, {frequency: 880});
    let g1 = new GainNode(context);
    let g2 = new GainNode(context);
    let merger = new ChannelMergerNode(context, {numberOfInputs: 2});
    merger.connect(context.destination);
    s1.connect(g1).connect(merger, 0, 0);
    s2.connect(g2).connect(merger, 0, 1);
    s1.start();
    s2.start();
    

    You should hear a 440 Hz tone in one ear and an 880 Hz tone in the other. This, of course, assumes your audio HW supports stereo. Check to see if context.destination.channelCount is actually two.

    If it still sounds funny, try setting g1.gain.value = 0 or g2.gain.value = 0 (but not both). This should force the sound to only the left or only the right ear. If not, something else is wrong. I tested this at https://hoch.github.io/canopy and it works as I expected.