Tags: c#, xna, monogame, libretro

How to convert an array of int16 sound samples to a byte array to use in MonoGame/XNA


I'm writing a libretro frontend in C#/MonoGame. I've managed to get a crude (but working) video blitter, but now I'm struggling with sound.

From the API:

/* Renders multiple audio frames in one go.
 *
 * One frame is defined as a sample of left and right channels, interleaved.
 * I.e. int16_t buf[4] = { l, r, l, r }; would be 2 frames.
 * Only one of the audio callbacks must ever be used.
 */
typedef size_t (*retro_audio_sample_batch_t)(const int16_t *data,
      size_t frames);
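
Concretely, the interleaved layout means even indices hold left-channel samples and odd indices hold right-channel samples. A minimal sketch of deinterleaving such a buffer in C# (the array names here are illustrative, not part of the libretro API):

```csharp
// Two frames of interleaved stereo: { L0, R0, L1, R1 }
short[] interleaved = { 100, -100, 200, -200 };
int frames = interleaved.Length / 2;

short[] left = new short[frames];
short[] right = new short[frames];
for (int f = 0; f < frames; f++)
{
    left[f] = interleaved[f * 2];      // even indices: left channel
    right[f] = interleaved[f * 2 + 1]; // odd indices: right channel
}
```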

So, the samples are signed 16-bit integers. I'm trying to build a SoundEffect from the buffer like this:

        int size = SoundEffect.GetSampleSizeInBytes(
            TimeSpan.FromMilliseconds((float)1000 / (int)_libretro.GetAVInfo().timing.fps),
            (int)_libretro.GetAVInfo().timing.sample_rate,
            AudioChannels.Mono);
        data = _libretro.GetSoundBuffer().data;

        byte[] buffer = new byte[size];
        for (int i = 0; i < size -1 ; i+=2)
        {
            Int16 chunk = Marshal.ReadInt16(data);

            byte b1 = (byte)(chunk);
            byte b2 = (byte)(chunk >> 8);
            buffer[i+1] = b1;
            buffer[i] = b2;

            //move ahead 4 bytes skipping the second sound channel for now
            data = data + (sizeof(byte)*4);
        }

        SoundEffect sound_left = new SoundEffect(buffer, (int)_libretro.GetAVInfo().timing.sample_rate, AudioChannels.Mono);
        sound_left.Play();

And I'm getting sound, and the sound pattern is clearly distinguishable, but it's garbled. Do you see anything immediately wrong with my implementation?


Solution

  • This method converts the sample data to a byte array. It works with any channel count (tested with mono and stereo).

        public static byte[] GetSamplesWaveData(float[] samples, int samplesCount)
        {
            var pcm = new byte[samplesCount * 2]; // 2 bytes per 16-bit sample
            int sampleIndex = 0,
                pcmIndex = 0;
    
            while (sampleIndex < samplesCount)
            {
                // Scale the float sample into the signed 16-bit range.
                var outsample = (short)(samples[sampleIndex] * short.MaxValue);
                // Write little-endian: low byte first, then high byte.
                pcm[pcmIndex] = (byte)(outsample & 0xff);
                pcm[pcmIndex + 1] = (byte)((outsample >> 8) & 0xff);
    
                sampleIndex++;
                pcmIndex += 2;
            }
    
            return pcm;
        }
    

    Please note that the float[] samples values are expected to be in the range [-1, 1].
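
Since the libretro core already delivers signed 16-bit samples, the float scaling step can be skipped and the samples written straight out as bytes. A sketch of that direct conversion (the helper name `GetInt16WaveData` is hypothetical); note that 16-bit PCM is little-endian, so the low byte must come first, which is the reverse of the byte order in the question's loop:

```csharp
// Converts interleaved (or mono) int16 samples to the little-endian
// byte layout expected for 16-bit PCM audio data.
public static byte[] GetInt16WaveData(short[] samples)
{
    var pcm = new byte[samples.Length * 2];
    for (int i = 0; i < samples.Length; i++)
    {
        pcm[i * 2] = (byte)(samples[i] & 0xff);            // low byte first
        pcm[i * 2 + 1] = (byte)((samples[i] >> 8) & 0xff); // high byte second
    }
    return pcm;
}
```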