
Audio Unit RemoteIO Setting interleaved float gives kAudioUnitErr_FormatNotSupported


I am working with an Audio Unit RemoteIO to obtain low-latency audio output. As far as I know, an audio unit only accepts certain audio formats depending on the hardware. My problem is that my C++ DSP sound engine works with interleaved float PCM, and I do not want to implement a format converter since it could slow things down in the RemoteIO callback. I tried obtaining a low-latency Audio Unit with the following format:

AudioStreamBasicDescription const audioDescription = {
            .mSampleRate        = defaultSampleRate,
            .mFormatID          = kAudioFormatLinearPCM,
            .mFormatFlags       = kAudioFormatFlagIsFloat,
            .mBytesPerPacket    = defaultSampleRate * STEREO_CHANNEL,
            .mFramesPerPacket   = 1,
            .mBytesPerFrame     = STEREO_CHANNEL * sizeof(Float32),
            .mChannelsPerFrame  = STEREO_CHANNEL,
            .mBitsPerChannel    = 8 * sizeof(Float32),
            .mReserved          = 0
        };
        
        status = AudioUnitSetProperty(audioUnit,
                                          kAudioUnitProperty_StreamFormat,
                                          kAudioUnitScope_Input,
                                          kOutputBus,
                                          &audioDescription,
                                          sizeof(audioDescription));

This fails with the error code kAudioUnitErr_FormatNotSupported (-10868). If I instead request a non-interleaved float PCM stream with the following:

AudioStreamBasicDescription const audioDescription = {
            .mSampleRate        = defaultSampleRate,
            .mFormatID          = kAudioFormatLinearPCM,
            .mFormatFlags       = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved,
            .mBytesPerPacket    = sizeof(float),
            .mFramesPerPacket   = 1,
            .mBytesPerFrame     = sizeof(float),
            .mChannelsPerFrame  = STEREO_CHANNEL,
            .mBitsPerChannel    = 8 * sizeof(float),
            .mReserved          = 0
        };
        
        status = AudioUnitSetProperty(audioUnit,
                                          kAudioUnitProperty_StreamFormat,
                                          kAudioUnitScope_Input,
                                          kOutputBus,
                                          &audioDescription,
                                          sizeof(audioDescription));

Everything works fine. However, I want an interleaved audio stream so that my DSP engine can work without format conversion. Is this possible at all?

P.S. Waiting for hotpaw2 to guide me :)


Solution

  • Your error is probably due to this line:

    .mBytesPerPacket    = defaultSampleRate * STEREO_CHANNEL,
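
    With mFramesPerPacket set to 1, a packet is a single frame, so mBytesPerPacket must equal mBytesPerFrame (STEREO_CHANNEL * sizeof(Float32)), not the sample rate multiplied by the channel count. A minimal sketch of a corrected interleaved description, assuming the kAudioFormatFlagIsPacked flag from your working non-interleaved format also applies here, might look like this:

    AudioStreamBasicDescription const audioDescription = {
        .mSampleRate        = defaultSampleRate,
        .mFormatID          = kAudioFormatLinearPCM,
        // interleaved: no kAudioFormatFlagIsNonInterleaved flag
        .mFormatFlags       = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked,
        // mFramesPerPacket == 1, so bytes per packet == bytes per frame
        .mBytesPerPacket    = STEREO_CHANNEL * sizeof(Float32),
        .mFramesPerPacket   = 1,
        .mBytesPerFrame     = STEREO_CHANNEL * sizeof(Float32),
        .mChannelsPerFrame  = STEREO_CHANNEL,
        .mBitsPerChannel    = 8 * sizeof(Float32),
        .mReserved          = 0
    };

    Whether the hardware ultimately accepts interleaved floats still depends on the device and iOS version, but with matching packet and frame sizes the description is at least self-consistent.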