Tags: c++, audio, game-engine, openal, ogg

Common sources of OpenAL distortion?


I'm working on an audio engine for a game I'm making. However, when I play my sound clips, they come out recognizable but mangled and distorted. I generated a sine wave, which should give a pure tone, but it has the same distortion, so I think the problem is on the OpenAL end.

Thing is, I'm not doing anything fancy with OpenAL.

First, I generate 48000 samples of a sine wave:

  #include <math.h>
  #define PI 3.14159265

  float amplitude = .5f;
  float frequency = 440;
  float phase = 0.f;
  float time = 0.f;
  int sampleRate = 48000;
  float dt = 1.0f / sampleRate;

  float sineWave[48000];
  fox_for(sample, sampleRate) { // standard macro for for loop
    float val = amplitude * sin(2 * PI * frequency * time + phase);
    sineWave[sample] = val;
    time += dt;
  }

And everything's fine: a bunch of floats in [-.5, .5], in the pattern of a sine wave. Then I do the standard OpenAL steps: generate a source and a buffer, bind the data to the buffer and the buffer to the source, and then play the sound.

  // Open device and context
  if (ALCdevice *device = alcOpenDevice(NULL)) {
    if (ALCcontext *context = alcCreateContext(device, NULL)) {
      alcMakeContextCurrent(context);
    } else {
      gameLog.writeStr("ALC context opening failed");
    }
  } else {
    gameLog.writeStr("ALC device opening failed");
  }

  // Generate a source, set its properties to default explicitly
  ALuint source;
  alGenSources(1, &source);

  alSourcef(source, AL_PITCH, 1.0f);
  alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);
  alSource3f(source, AL_POSITION, 0.0f, 0.0f, 0.0f);
  alSourcef(source, AL_GAIN, 0.5f);
  alSource3f(source, AL_VELOCITY, 0.f, 0.f, 0.f);

  // Fill buffers with sine wave data
  ALuint testBuf;
  alGenBuffers(1, &testBuf);
  alBufferData(testBuf, AL_FORMAT_MONO16, &sineWave[0], 48000 * sizeof(float), 48000);

  // Play sound
  alSourcei(source, AL_BUFFER, testBuf);
  alSourcePlay(source);

And I get what is distinctly a note at 440 Hz, but fuzzy and rather distorted. My first thought was that the format AL_FORMAT_MONO16 might be causing the floats (which are 32 bits) to be interpreted as pairs of 16-bit numbers. But if that were the case, I'd expect the sound to be extremely distorted and unrecognizable.
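
As a quick sanity check of that idea, a small standalone snippet like the following (nothing engine-specific, just plain C++ I wrote to poke at the bytes) shows what a single float sample looks like when reinterpreted as two 16-bit integers, which is how AL_FORMAT_MONO16 would read this buffer:

  #include <stdint.h>
  #include <stdio.h>
  #include <string.h>

  int main() {
    float sample = 0.5f;                      // one float sample from the sine buffer
    int16_t halves[2];
    memcpy(halves, &sample, sizeof(sample));  // same 4 bytes, read as two 16-bit ints
    printf("%d %d\n", halves[0], halves[1]);  // the two 16-bit samples OpenAL would see
    return 0;
  }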

In my game, data is read as raw bytes using libvorbisfile, and the format and frequency are specified by the header data parsed from the .ogg file. Hence, I don't think that the problem is using the wrong format or frequency.
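
For reference, the loading path looks roughly like this. This is a simplified sketch; loadOggToBuffer and the missing error handling are just for illustration, but the libvorbisfile calls (ov_fopen, ov_info, ov_read) are the ones I use:

  #include <AL/al.h>
  #include <vorbis/vorbisfile.h>
  #include <vector>

  bool loadOggToBuffer(const char *path, ALuint buffer) {
    OggVorbis_File vf;
    if (ov_fopen(path, &vf) != 0)            // open the file and parse the Vorbis headers
      return false;

    vorbis_info *info = ov_info(&vf, -1);    // channels and sample rate from the header
    ALenum format = (info->channels == 1) ? AL_FORMAT_MONO16 : AL_FORMAT_STEREO16;

    std::vector<char> pcm;
    char chunk[4096];
    int bitstream = 0;
    long bytes;
    // ov_read decodes to interleaved signed 16-bit PCM (word size 2, signed)
    while ((bytes = ov_read(&vf, chunk, sizeof(chunk), 0, 2, 1, &bitstream)) > 0)
      pcm.insert(pcm.end(), chunk, chunk + bytes);

    alBufferData(buffer, format, pcm.data(), (ALsizei)pcm.size(), (ALsizei)info->rate);
    ov_clear(&vf);
    return true;
  }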

If anyone has experience with OpenAL, I would very much appreciate it if you could point out a problem or give a hint to help with debugging. Thanks!


Solution

  • Your audio data is in the wrong format. From the documentation, 16-bit data is signed integral data, from -32768 up to +32767, whereas you are handing alBufferData 32-bit floats in [-.5, .5]. A minimal conversion sketch follows below.

    You're still hearing something recognizable as a sine wave because the upper 16 bits of each float (the sign, exponent, and top mantissa bits) still rise and fall roughly with the sine. So every other 16-bit sample is vaguely sine-shaped, while the samples in between are essentially random, and that noise is the fuzz you hear.
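
    A minimal sketch of one way to fix the test tone, assuming the same variables as in the question (amplitude, frequency, phase, dt, sampleRate, and testBuf): scale each float into a signed 16-bit sample before the alBufferData call.

      #include <stdint.h>  // for int16_t

      int16_t sineWave16[48000];
      float t = 0.f;
      for (int sample = 0; sample < sampleRate; ++sample) {
        float val = amplitude * sin(2 * PI * frequency * t + phase);
        sineWave16[sample] = (int16_t)(val * 32767.f);  // scale [-1, 1] into the 16-bit range
        t += dt;
      }

      // The size argument is now the byte count of the 16-bit samples,
      // which matches what AL_FORMAT_MONO16 expects.
      alBufferData(testBuf, AL_FORMAT_MONO16, sineWave16, sizeof(sineWave16), sampleRate);

    Alternatively, if your implementation exposes the AL_EXT_FLOAT32 extension (check with alIsExtensionPresent), AL_FORMAT_MONO_FLOAT32 accepts float samples directly, but converting to 16-bit as above works everywhere.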