Is it possible to capture the output of an `AVPlayer` using `AVCaptureSession`? I believe it's possible, but I can't figure out how to use the `AVPlayer` as an input.
You cannot plug an `AVPlayer` into an `AVCaptureSession`, although you can get access to the player's video and audio in the form of `CVPixelBuffer`s and `AudioBufferList`s.
This is achieved via two APIs: `AVPlayerItemVideoOutput` for video and `MTAudioProcessingTap` for audio.
Despite being a C API, `MTAudioProcessingTap` is easier to integrate because, like `AVCaptureSession`, it pushes samples to you via a callback, whereas with `AVPlayerItemVideoOutput` you pull frames for a given time.
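To illustrate the pull model, here is a minimal sketch of driving `AVPlayerItemVideoOutput` with a `CADisplayLink` (the class name `PlayerFrameGrabber` is hypothetical, and in practice you would also check `AVPlayerItem` status before adding the output):

```swift
import AVFoundation
import UIKit

final class PlayerFrameGrabber {
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private var displayLink: CADisplayLink?
    private let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        player.currentItem?.add(videoOutput)
    }

    func start() {
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Map the upcoming display time to an item time, then pull a
        // pixel buffer only if a new frame is available for that time.
        let itemTime = videoOutput.itemTime(forHostTime: link.targetTimestamp)
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil)
        else { return }
        // Process the CVPixelBuffer here (e.g. hand it to a renderer or encoder).
        _ = pixelBuffer
    }
}
```

Note that `copyPixelBuffer(forItemTime:itemTimeForDisplay:)` returns `nil` when no frame is available, which is why you pull on a timer rather than being called back.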
For this reason, if you want an `AVCaptureSession`-like experience (real-time, push), you should probably let the audio tap drive your frame pulling.
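On the audio side, a minimal sketch of installing an `MTAudioProcessingTap` might look like the following (the function name `installAudioTap` is hypothetical; the tap's `process` callback receives the player's audio as an `AudioBufferList` in real time):

```swift
import AVFoundation
import MediaToolbox

func installAudioTap(on playerItem: AVPlayerItem) {
    guard let track = playerItem.asset.tracks(withMediaType: .audio).first else { return }

    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: nil,
        finalize: nil,
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            // Pull the source audio into bufferListInOut; this callback is
            // invoked by the player as the item plays (the "push" model).
            MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
            // Inspect or process the AudioBufferList here, and/or use this
            // callback as the clock for pulling video frames.
        })

    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                            kMTAudioProcessingTapCreationFlag_PostEffects,
                                            &tap)
    guard status == noErr, let audioTap = tap else { return }

    // Attach the tap to the item's audio track via an audio mix.
    let params = AVMutableAudioMixInputParameters(track: track)
    params.audioTapProcessor = audioTap.takeRetainedValue()
    let audioMix = AVMutableAudioMix()
    audioMix.inputParameters = [params]
    playerItem.audioMix = audioMix
}
```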
There is some `AVPlayerItemVideoOutput` sample code in Objective-C here and in Swift here, and an example of using an `MTAudioProcessingTap` in Swift here.