ios · avfoundation · crop · video-processing

How to overlay one video on another in iOS?


I am trying to crop an already recorded video into a circle in iOS. How might I go about doing this? I know how I would do it with an AVCaptureSession, but I don't know how to pass in an already recorded video as an AVCaptureDevice. Is there a way to crop a video into a circle? I want to overlay it on top of another video, so it needs a transparent background as well. Thanks.


Solution

  • I guess you want to produce something like this:

    demo of video overlay with oval crop

    You don't want an AVCaptureSession, because you're not capturing video. You want an AVMutableComposition. You need to read the “Editing” section of the AV Foundation Programming Guide. Here's a summary of what you need to do:

    1. Create the AVAsset objects for your videos and wait for them to load their tracks.

    2. Create an AVMutableComposition.

    3. Add a separate AVMutableCompositionTrack to the composition for each of the input videos. Make sure to assign explicit, different track IDs to each track. If you let the system pick, it will use track ID 1 for each and you won't be able to access both later in the compositor. (Steps 1-3 are sketched in code after this list.)

    4. Create an AVMutableVideoComposition.

    5. Create an AVMutableVideoCompositionInstruction.

    6. For each input video, create an AVMutableVideoCompositionLayerInstruction and explicitly assign the track IDs you used back in step 3.

    7. Set the AVMutableVideoCompositionInstruction's layerInstructions to the two layer instructions you created in step 6.

    8. Set the AVMutableVideoComposition's instructions to the instruction you created in step 5. (Steps 4-8 are sketched in code after this list.)

    9. Create a class that implements the AVVideoCompositing protocol. Set the customVideoCompositorClass of the video composition (created in step 4) to this custom class (e.g. videoComposition.customVideoCompositorClass = [CustomVideoCompositor class];).

    10. In your custom compositor, get the input pixel buffers from the AVAsynchronousVideoCompositionRequest and use them to draw the composite frame (containing a background video frame overlaid by a circular chunk of the foreground video frame). You can do this however you want. I did it using Core Graphics because that's easy, but you'll probably want to use OpenGL (or maybe Metal) for efficiency in a production app. Be sure to specify kCVPixelBufferOpenGLESCompatibilityKey if you go with OpenGL. (A Core Graphics compositor is sketched after this list.)

    11. Create an AVAssetExportSession using your composition from step 2.

    12. Set the session's output URL and file type.

    13. Set the session's videoComposition to the video composition from step 4.

    14. Tell the session to exportAsynchronouslyWithCompletionHandler:. It will probably be slow! (Steps 11-14 are sketched after this list.)
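
    A rough Objective-C sketch of steps 1-3 follows. The asset URLs and variable names are placeholders I've invented for the sketch, and real code should check the key-loading status and the insertion errors instead of ignoring them.

    ```objc
    #import <AVFoundation/AVFoundation.h>

    // Step 1: create the assets and wait for their tracks to load.
    // ("foregroundURL" and "backgroundURL" are placeholders for your own file URLs.)
    AVAsset *foregroundAsset = [AVAsset assetWithURL:foregroundURL];
    AVAsset *backgroundAsset = [AVAsset assetWithURL:backgroundURL];
    // In real code, run the remaining steps only after both assets report
    // AVKeyValueStatusLoaded for the @"tracks" key, e.g. via
    // -loadValuesAsynchronouslyForKeys:completionHandler:.

    // Step 2: the composition that will hold both videos.
    AVMutableComposition *composition = [AVMutableComposition composition];

    // Step 3: one track per input video, with explicit, distinct track IDs.
    AVMutableCompositionTrack *foregroundTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
    AVMutableCompositionTrack *backgroundTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:2];

    NSError *error = nil;
    AVAssetTrack *foregroundSource = [foregroundAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    AVAssetTrack *backgroundSource = [backgroundAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    [foregroundTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, foregroundAsset.duration)
                             ofTrack:foregroundSource
                              atTime:kCMTimeZero
                               error:&error];
    [backgroundTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backgroundAsset.duration)
                             ofTrack:backgroundSource
                              atTime:kCMTimeZero
                               error:&error];
    ```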
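
    Steps 4-8 then wire those track IDs into the video composition. The frame rate and render size below are arbitrary values chosen for the sketch.

    ```objc
    // Step 4: the video composition that will drive the custom compositor.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);    // 30 fps -- pick what suits your sources
    videoComposition.renderSize = CGSizeMake(1280, 720);   // output size -- also your choice

    // Step 5: one instruction spanning the whole composition.
    AVMutableVideoCompositionInstruction *instruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);

    // Step 6: one layer instruction per input, tied to the track IDs assigned in step 3.
    AVMutableVideoCompositionLayerInstruction *foregroundLayer =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
    foregroundLayer.trackID = foregroundTrack.trackID;   // 1
    AVMutableVideoCompositionLayerInstruction *backgroundLayer =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
    backgroundLayer.trackID = backgroundTrack.trackID;   // 2

    // Step 7: attach both layer instructions to the instruction.
    instruction.layerInstructions = @[ foregroundLayer, backgroundLayer ];

    // Step 8: attach the instruction to the video composition.
    videoComposition.instructions = @[ instruction ];

    // Step 9: hand per-frame rendering over to the custom compositor (sketched next).
    videoComposition.customVideoCompositorClass = [CustomVideoCompositor class];
    ```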
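
    For steps 9-10, the compositor class might look roughly like this. The class name matches the one used in step 9; the Core Graphics drawing path, the BGRA pixel format, and the circle's position and size are assumptions made for illustration, and orientation handling and error checking are omitted.

    ```objc
    #import <AVFoundation/AVFoundation.h>
    #import <CoreGraphics/CoreGraphics.h>

    @interface CustomVideoCompositor : NSObject <AVVideoCompositing>
    @end

    // Wrap a BGRA pixel buffer in a CGImage so Core Graphics can draw it.
    static CGImageRef CreateImageFromPixelBuffer(CVPixelBufferRef buffer) {
        CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
        CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                     CVPixelBufferGetWidth(buffer),
                                                     CVPixelBufferGetHeight(buffer),
                                                     8, CVPixelBufferGetBytesPerRow(buffer), rgb,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef image = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        CGColorSpaceRelease(rgb);
        CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
        return image;  // caller releases
    }

    @implementation CustomVideoCompositor

    // Ask for BGRA buffers so Core Graphics can read and write them directly.
    - (NSDictionary *)sourcePixelBufferAttributes {
        return @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    }

    - (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
        return @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    }

    - (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
        // Nothing to do in this sketch.
    }

    - (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
        // Track IDs 1 and 2 are the explicit IDs assigned in step 3.
        CVPixelBufferRef foreground = [request sourceFrameByTrackID:1];
        CVPixelBufferRef background = [request sourceFrameByTrackID:2];
        CVPixelBufferRef output = [request.renderContext newPixelBuffer];
        // (Real code should call finishWithError: if any of these are NULL.)

        CVPixelBufferLockBaseAddress(output, 0);
        CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(output),
                                                     CVPixelBufferGetWidth(output),
                                                     CVPixelBufferGetHeight(output),
                                                     8, CVPixelBufferGetBytesPerRow(output), rgb,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

        CGRect frame = CGRectMake(0, 0, CVPixelBufferGetWidth(output), CVPixelBufferGetHeight(output));
        CGImageRef backgroundImage = CreateImageFromPixelBuffer(background);
        CGImageRef foregroundImage = CreateImageFromPixelBuffer(foreground);

        // The background frame fills the whole output frame.
        CGContextDrawImage(context, frame, backgroundImage);

        // Clip to a circle and draw the foreground frame into it.
        CGRect circle = CGRectMake(20.0, 20.0, 240.0, 240.0);
        CGContextAddEllipseInRect(context, circle);
        CGContextClip(context);
        CGContextDrawImage(context, circle, foregroundImage);

        CGImageRelease(backgroundImage);
        CGImageRelease(foregroundImage);
        CGContextRelease(context);
        CGColorSpaceRelease(rgb);
        CVPixelBufferUnlockBaseAddress(output, 0);

        [request finishWithComposedVideoFrame:output];
        CVPixelBufferRelease(output);
    }

    @end
    ```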
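
    Finally, steps 11-14. The output URL and export preset here are placeholders; use whatever fits your app.

    ```objc
    // Steps 11-13: an export session over the composition, rendering through the video composition.
    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;                 // placeholder -- e.g. a file under NSTemporaryDirectory()
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.videoComposition = videoComposition;   // the video composition from step 4

    // Step 14: export asynchronously; a custom compositor makes this slow.
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished: %@", exportSession.outputURL);
        } else {
            NSLog(@"Export failed: %@", exportSession.error);
        }
    }];
    ```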

    You can find my test project in this GitHub repository.