swift, augmented-reality, arkit, realitykit

ARKit – How to use ARImageTrackingConfiguration with ARFaceTrackingConfiguration?


I have built an app that uses image tracking and swaps in flat images. I am also using people occlusion, so people can get photos in front of those images. I really want this app to have a selfie mode, so people can take their own pictures in front of the image-swapped areas.

I'm reading the features on ARKit 3.5, but as far as I can tell, the only front-facing camera support is with ARFaceTrackingConfiguration, which doesn't support image tracking. ARImageTrackingConfiguration and ARWorldTrackingConfiguration only use the back camera.

Is there any way to make a selfie mode with people occlusion (and image tracking) using the front-facing camera?
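Here's a simplified sketch of my current rear-camera setup, using image detection on ARWorldTrackingConfiguration (the "AR Resources" reference-image group name and the sceneView outlet are just placeholders):

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let config = ARWorldTrackingConfiguration()

        // Reference images to detect and swap (group name is a placeholder).
        if let refImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                            bundle: nil) {
            config.detectionImages = refImages
            config.maximumNumberOfTrackedImages = 1
        }

        // People occlusion, where the hardware supports it.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        sceneView.session.run(config)
    }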


Solution

  • About ARConfigurations

    The answer is NO, you can't use any ARConfiguration except ARFaceTrackingConfiguration with the front camera. However, you can simultaneously run ARFaceTrackingConfiguration on the front camera and ARWorldTrackingConfiguration on the rear camera. This lets users interact with AR content seen through the rear camera while using their face as a kind of controller.

    Look at this docs page to find out which camera (rear or front) each configuration corresponds to.

    Here's a table of the eight tracking configurations available in ARKit 5.0 and the camera each one uses:

    ARConfiguration                         Camera
    ------------------------------------    ------
    ARWorldTrackingConfiguration            Rear
    ARBodyTrackingConfiguration             Rear
    AROrientationTrackingConfiguration      Rear
    ARImageTrackingConfiguration            Rear
    ARFaceTrackingConfiguration             FRONT
    ARObjectScanningConfiguration           Rear
    ARPositionalTrackingConfiguration       Rear
    ARGeoTrackingConfiguration              Rear
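    As a quick run-time sketch, you could pick the camera by checking configuration support before running a session (the wantsSelfieMode flag and sceneView outlet are hypothetical):

    let config: ARConfiguration
    if wantsSelfieMode, ARFaceTrackingConfiguration.isSupported {
        config = ARFaceTrackingConfiguration()     // front camera
    } else {
        config = ARWorldTrackingConfiguration()    // rear camera
    }
    sceneView.session.run(config)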

    Simultaneous World and Face configs

    To use World Tracking driven by Face Tracking (Face Tracking being the primary configuration), use the following code:

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Make sure the device supports face tracking and the combined mode.
        guard ARFaceTrackingConfiguration.isSupported,
              ARFaceTrackingConfiguration.supportsWorldTracking
        else {
            fatalError("We can't do face tracking")
        }
        // Face Tracking is the primary config; world tracking is enabled on top of it.
        let config = ARFaceTrackingConfiguration()
        config.isWorldTrackingEnabled = true
        sceneView.session.run(config)
    }
    

    Or you can use Face Tracking as a secondary configuration:

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World Tracking is the primary config here.
        let config = ARWorldTrackingConfiguration()

        // Enable face tracking on the front camera only if the device supports it.
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            config.userFaceTrackingEnabled = true
        } else {
            print("User face tracking is NOT SUPPORTED on this device")
        }
        sceneView.session.run(config)
    }
    

    Note that both properties are available in iOS 13 and higher.

    var userFaceTrackingEnabled: Bool { get set }
    
    var isWorldTrackingEnabled: Bool { get set }
    

    P.S.

    At the moment (15th March 2024), .userFaceTrackingEnabled = true doesn't work on 2020 iPads. Apple officially notes this in the Configure and Start the Session section of the documentation (read the comments in the sample code):

    2020 iPads do not support user face-tracking while world tracking.