Audio Unit (AUv3) in macOS only works in selected track in DAW


When I build an Audio Unit extension (instrument/aumu) using the template in Xcode (New project → App, then New target → Audio Unit Extension) and then build and run it in either Logic Pro X or GarageBand, the plugin only functions when the track it's inserted on is selected. If any other track is selected, breakpoints in e.g. the overridden process or handleMIDIEvent functions never get triggered. (Plus, the unselected tracks start to output a constant, short-period glitch noise if they were actually outputting sound before the selected track changed.)

Any idea why this happens? I would suspect a fault on Xcode's or the DAW's part, but I have seen other macOS AUv3 plugins (still an extremely rare breed, unfortunately) work just fine, so I know it's definitely possible.


Solution

  • After much fiddling, I finally found the problem. (I REALLY wish there were more knowledge about AUv3 widely available online...)

    It seems that on each render cycle, both Logic Pro X and GarageBand ask the plugin for blocks of different lengths depending on whether the plugin sits on the selected track. If the track is selected, the requested block length matches the DAW's I/O Buffer Size setting, presumably for highest-priority rendering. Unselected tracks are always asked for 1024 frames (the longest Logic's buffers seem to go), regardless of the I/O Buffer Size setting.
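
    You can verify this yourself by logging the requested block length from the kernel's process override. A minimal sketch (the MyKernel class name and the process signature are assumptions modeled on the template's DSPKernel.hpp, so adapt them to your project):

    #include <AudioToolbox/AudioToolbox.h>  // AUAudioFrameCount
    #include <cstdio>

    class MyKernel {
    public:
        void process(AUAudioFrameCount frameCount, AUAudioFrameCount bufferOffset) {
            (void)bufferOffset; // unused in this sketch
            // Quick one-off check only; don't leave printf on the render thread.
            // Expect frameCount == I/O Buffer Size while the track is selected,
            // and a constant 1024 frames while it is not.
            static AUAudioFrameCount lastSeen = 0;
            if (frameCount != lastSeen) {
                lastSeen = frameCount;
                printf("requested block length: %u frames\n", frameCount);
            }
            // ... actual DSP goes here ...
        }
    };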

    1024 frames is longer than the

    AUAudioFrameCount maxFramesToRender = 512;
    

    that the Audio Unit Extension template stubs in DSPKernel.hpp, which is why rendering fails only on unselected tracks. (The short-period glitch noise I mentioned appears to be whatever values were left in the output buffer from the last playback being re-output once every 1024 frames.)

    Setting maxFramesToRender = 1024; fixes that problem.
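
    A more robust variant, rather than hard-coding 1024, is to size the kernel from whatever maximum the host negotiates: AUAudioUnit exposes a maximumFramesToRender property that the host sets before allocateRenderResources is called. A minimal sketch, assuming a setMaximumFrames helper (a hypothetical name; the template passes this value around differently):

    #include <AudioToolbox/AudioToolbox.h>  // AUAudioFrameCount

    class DSPKernel {
    public:
        // Call this from your AU's allocateRenderResources, passing
        // self.maximumFramesToRender, so the kernel never receives a block
        // larger than it has allocated for.
        void setMaximumFrames(AUAudioFrameCount maxFrames) {
            maxFramesToRender = maxFrames;
            // (re)allocate any per-block scratch buffers to maxFrames here
        }
    private:
        AUAudioFrameCount maxFramesToRender = 1024; // safe floor for Logic/GarageBand
    };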

    And now for a heavily opinionated rant: I can't help but feel this default maxFramesToRender value sets newbies (like me) up for failure, since 1) it's never mentioned in the official tutorials or documentation (AFAIK), 2) it doesn't play nicely with Apple's own DAWs, presumably the most obvious places to test, and 3) it initially works, but only until you try to play two tracks at once, by which point you probably already have a lot of code written and are all the more susceptible to confusion.

    But oh well, I guess it is what it is.