Tags: macos, cocoa, avfoundation, core-image, cifilter

Applying a CIFilter on the GPU in Cocoa


Apple's docs give this example for applying a CIFilter to an AVAsset:

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

   // Clamp to avoid blurring transparent pixels at the image edges
   let source = request.sourceImage.clampingToExtent()
   filter.setValue(source, forKey: kCIInputImageKey)

   // Vary filter parameters based on video timing
   let seconds = CMTimeGetSeconds(request.compositionTime)
   filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

   // Crop the blurred output to the bounds of the original image
   let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

   // Provide the filter output to the composition
   request.finish(with: output, context: nil)
})

This works great on some videos (it appears to be much more performant with those using the AAC codec), but on others CPU usage shoots up and the video never finishes processing. Is there a way to move this onto the GPU to speed things up and avoid tying up so much of the CPU? I saw this question for iOS, but CIContext's contextWithEAGLContext: is not available on OS X. I'm new to AVFoundation/video processing; is there an equivalent on OS X?

Note: I'm not looking to do this in real time; I simply want to apply the filter and export the file to the filesystem using the GPU.
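
For reference, a composition like this is normally exported with AVAssetExportSession. A minimal sketch in the same Swift-3-era style as the snippet above (the preset, output file type, and output path are placeholders, not part of the original question; newer SDKs use AVFileType.mov):

import AVFoundation

// Minimal export sketch: attach the filtered composition to an export session
// and write the result to disk. Preset, file type, and path are placeholders.
let export = AVAssetExportSession(asset: asset,
                                  presetName: AVAssetExportPresetHighestQuality)!
export.videoComposition = composition
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = URL(fileURLWithPath: "/tmp/filtered.mov")

export.exportAsynchronously {
    switch export.status {
    case .completed:
        print("Export finished")
    default:
        print("Export failed: \(String(describing: export.error))")
    }
}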


Solution

  • macOS instead has contextWithCGLContext for OpenGL:

    + (CIContext *)contextWithCGLContext:(CGLContextObj)cglctx
                             pixelFormat:(nullable CGLPixelFormatObj)pixelFormat
                              colorSpace:(nullable CGColorSpaceRef)colorSpace
                                 options:(nullable NSDictionary<NSString*,id> *)options;
    

    or contextWithMTLDevice: for Metal if you prefer that (see the Swift sketch after this list):

    + (CIContext *)contextWithMTLDevice:(id<MTLDevice>)device;
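
From Swift, that just means building one GPU-backed CIContext up front and passing it to request.finish(with:context:) instead of nil. A minimal sketch, assuming the same asset and filter as in the question (shown with a Metal-backed context; the CGL variant is constructed analogously):

import AVFoundation
import CoreImage
import Metal

// Create one Metal-backed CIContext and reuse it for every frame so the
// filtering work runs on the GPU rather than the CPU.
let device = MTLCreateSystemDefaultDevice()!
let gpuContext = CIContext(mtlDevice: device)

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)
    filter.setValue(CMTimeGetSeconds(request.compositionTime) * 10.0, forKey: kCIInputRadiusKey)
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Passing the GPU-backed context here is the only change from the original code
    request.finish(with: output, context: gpuContext)
})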