Tags: swiftui, avfoundation, core-graphics, metal, cifilter

What is the most efficient way to show a live video preview with a CIFilter applied?


What is the most CPU-, GPU-, and energy-efficient way on iOS to display a live preview of the camera and apply multiple CIFilters to that live preview?

Right now I have a solution that basically grabs each output frame, filters it, assigns the filtered image to a @Published property, and then displays that in a Metal view. But I don't really understand what's going on with Metal, and I suspect there's a faster way than routing everything through a @Published property.

What I have now is working, but I assume it's very inefficient and not taking advantage of Metal.

tl;dr: What's the most efficient way to build an AVCaptureVideoPreviewLayer equivalent that applies filters to the live preview as it's happening?


Solution

  • I think the approach you describe is already essentially the most efficient way.

    Core Image (in its default configuration) uses Metal to perform the image filtering operations on the GPU. An MTKView is meant to be used for displaying the results of a Metal pipeline, so that fits (a sketch of such a capture → filter → display pipeline follows at the end of this answer).

    As for the @Published property: I think it doesn't really matter how the pixel buffers are propagated to your filter chain and ultimately to the view as long as the method doesn't add too much overhead. Using Combine should be totally fine.

    There is, unfortunately, no convenient API for applying CIFilters to the camera feed the way there is for video playback and export (using AVVideoComposition; see the short playback example at the very end of this answer for contrast).
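
    As a rough illustration, here is a minimal sketch of the pipeline described above. It is not a definitive implementation: the class name, the chosen filter, and the omitted AVCaptureSession setup are all placeholders. The idea is that one CIContext is created on the same Metal device the MTKView draws with, the capture delegate wraps each pixel buffer in a CIImage and filters it, and the view's draw callback renders the latest filtered image directly into the drawable.

    ```swift
    import AVFoundation
    import CoreImage
    import MetalKit

    // Sketch only: AVCaptureSession setup (adding the camera input, adding the
    // video data output, assigning its sample buffer delegate) is omitted.
    final class FilteredPreviewRenderer: NSObject, MTKViewDelegate,
                                         AVCaptureVideoDataOutputSampleBufferDelegate {

        private let device = MTLCreateSystemDefaultDevice()!
        private lazy var commandQueue = device.makeCommandQueue()!
        // One CIContext backed by the same Metal device the view draws with,
        // so filtering and display both stay on the GPU.
        private lazy var ciContext = CIContext(mtlDevice: device)
        private let filter = CIFilter(name: "CIPhotoEffectNoir")!  // any filter chain works here
        private var latestImage: CIImage?                          // written from the capture queue

        func configure(_ view: MTKView) {
            view.device = device
            view.framebufferOnly = false   // Core Image needs to write into the drawable's texture
            view.delegate = self
        }

        // Called for every camera frame on the capture output's queue.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
            latestImage = filter.outputImage
        }

        // Renders the most recent filtered frame into the view's drawable.
        func draw(in view: MTKView) {
            guard let image = latestImage,
                  let drawable = view.currentDrawable,
                  let commandBuffer = commandQueue.makeCommandBuffer() else { return }
            // Real code would also scale/orient the image to the drawable size
            // and synchronize access to latestImage across queues.
            let destination = CIRenderDestination(mtlTexture: drawable.texture,
                                                  commandBuffer: commandBuffer)
            try? ciContext.startTask(toRender: image, to: destination)
            commandBuffer.present(drawable)
            commandBuffer.commit()
        }

        func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
    }
    ```

    If you prefer the Combine route you already have, latestImage could just as well be a @Published property that the view observes; the rendering path into the MTKView stays the same.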
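
    For comparison, this is the kind of convenience that does exist for playback and export. The URL and the filter below are placeholders, not part of your setup:

    ```swift
    import AVFoundation
    import CoreImage

    let movieURL = URL(fileURLWithPath: "/path/to/movie.mov")  // placeholder
    let asset = AVAsset(url: movieURL)
    let sepia = CIFilter(name: "CISepiaTone")!

    // AVFoundation calls this handler for every video frame during playback or export.
    let composition = AVVideoComposition(asset: asset) { request in
        sepia.setValue(request.sourceImage, forKey: kCIInputImageKey)
        request.finish(with: sepia.outputImage ?? request.sourceImage, context: nil)
    }

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)
    player.play()
    ```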