I want to be able to monitor audio on headphones before and during the capture of video.
I have an AVCaptureSession
set up to capture video and audio.
My idea is to hook an AVCaptureAudioDataOutput
instance up to the AVCaptureSession
for this and process the CMSampleBufferRefs
with a class implementing the AVCaptureAudioDataOutputSampleBufferDelegate
protocol.
But I am not sure how to route the audio to the headphones from there.
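For context, the capture side described above can be wired up roughly like this. This is a hedged sketch, not verbatim from the question: the class name `AudioMonitor` and the queue label are my own, and it assumes an already-configured `AVCaptureSession` with an audio input attached. It requires AVFoundation on an iOS device, so it is illustrative only:

```swift
import AVFoundation

// Hypothetical class receiving captured audio sample buffers.
final class AudioMonitor: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let audioOutput = AVCaptureAudioDataOutput()
    private let queue = DispatchQueue(label: "audio.monitor.queue")

    // Attach the audio data output to an existing, configured session.
    func attach(to session: AVCaptureSession) {
        audioOutput.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(audioOutput) {
            session.addOutput(audioOutput)
        }
    }

    // Called for each captured audio CMSampleBuffer; this is where the
    // samples would be handed off for headphone monitoring.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Forward sampleBuffer to the playback path (e.g. a ring buffer
        // drained by a Remote I/O audio unit).
    }
}
```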
What would be the most straightforward way to do this (highest-level frameworks, general approach)?
I ended up implementing this with an Audio Unit, the Remote I/O audio unit to be precise.
Apple's aurioTouch sample code provides a clear example of how to do this.
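For anyone following the same route, the Remote I/O setup boils down to creating the output unit, installing a render callback on its output bus, and starting it. Below is a hedged sketch of that pattern (not the aurioTouch code itself): the class name, the callback body, and the ring-buffer hand-off are placeholders, and it assumes an `AVAudioSession` has already been configured and activated. It compiles only against iOS's AudioToolbox:

```swift
import AudioToolbox

// Hypothetical monitor: plays captured audio out through the Remote I/O
// unit so it can be heard on headphones during capture.
final class HeadphoneMonitor {
    private var audioUnit: AudioUnit?

    func start() -> OSStatus {
        // Describe the Remote I/O audio unit.
        var desc = AudioComponentDescription(
            componentType: kAudioUnitType_Output,
            componentSubType: kAudioUnitSubType_RemoteIO,
            componentManufacturer: kAudioUnitManufacturer_Apple,
            componentFlags: 0,
            componentFlagsMask: 0)
        guard let component = AudioComponentFindNext(nil, &desc) else {
            return kAudioUnitErr_NoConnection
        }
        var status = AudioComponentInstanceNew(component, &audioUnit)
        guard status == noErr, let unit = audioUnit else { return status }

        // Install a render callback on the output bus (bus 0); the
        // system calls it whenever it needs samples to play.
        var callback = AURenderCallbackStruct(
            inputProc: monitorRenderCallback,
            inputProcRefCon: Unmanaged.passUnretained(self).toOpaque())
        status = AudioUnitSetProperty(
            unit,
            kAudioUnitProperty_SetRenderCallback,
            kAudioUnitScope_Input,
            0, // output bus
            &callback,
            UInt32(MemoryLayout<AURenderCallbackStruct>.size))
        guard status == noErr else { return status }

        status = AudioUnitInitialize(unit)
        guard status == noErr else { return status }
        return AudioOutputUnitStart(unit)
    }
}

// C-style render callback: fill ioData with the most recently captured
// samples (e.g. drained from a ring buffer fed by the capture delegate).
private func monitorRenderCallback(
    inRefCon: UnsafeMutableRawPointer,
    ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
    inTimeStamp: UnsafePointer<AudioTimeStamp>,
    inBusNumber: UInt32,
    inNumberFrames: UInt32,
    ioData: UnsafeMutablePointer<AudioBufferList>?) -> OSStatus {
    // Placeholder: copy pending captured samples into ioData here.
    return noErr
}
```

The key design point, which aurioTouch also demonstrates, is that capture and playback run on different threads, so the sample buffers from the capture delegate should be handed to the render callback through a lock-free structure such as a ring buffer rather than copied directly.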