swift · scenekit · augmented-reality · arkit · metal

Is ARKit 3 People Occlusion restricted to iPhone X and newer?


I started studying the new People Occlusion effect in iOS 13, so I downloaded the sample project and tried to build and run it on my device.

Running on an iPhone 7 Plus with iOS 13, it shows the following error:

2019-09-11 13:49:41.257236-0300 ARMatteExampleSwift[7298:1369425] Metal GPU Frame Capture Enabled

2019-09-11 13:49:41.257845-0300 ARMatteExampleSwift[7298:1369425] Metal API Validation Enabled

2019-09-11 13:49:41.589383-0300 ARMatteExampleSwift[7298:1369425] * Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'This set of frame semantics is not supported on this configuration' * First throw call stack: (0x19712c97c 0x196e550a4 0x1b20783d4 0x100552aac 0x100552e0c 0x19ab2fc08 0x19ab3029c 0x19aa4e24c 0x19aa53dc8 0x19aa4da94 0x19aa4aed4 0x19b16d954 0x19b16cf84 0x19b16def8 0x19b17ed44 0x19b12ed50 0x19b134cec 0x19a9112ec 0x19ada1d48 0x19a911dd4 0x19a91182c 0x19a911c00 0x19a9114bc 0x19a9159d8 0x19acd49ac 0x19adbaf08 0x19a915710 0x19adbae04 0x19a91557c 0x19a78aa8c 0x19a7895f4 0x19a78a7c4 0x19b13306c 0x19acf5390 0x19c1c1994 0x19c1e6960 0x19c1cc0f8 0x19c1e661c 0x100cf2c04 0x100cf6028 0x19c20b540 0x19c20b20c 0x19c20b734 0x1970aa7d0 0x1970aa728 0x1970a9ec0 0x1970a500c 0x1970a48ac 0x1a0eff328 0x19b136f00 0x100555a80 0x196f2f460) libc++abi.dylib: terminating with uncaught exception of type NSException

Trying to debug, I found this in the sample project's readme:

Note: To run the app, use an iOS device with A12 chip or later.

Why? Is Metal 2 restricted to A12 chips?


Solution

  • The ARKit 3.0 People Occlusion feature is restricted to devices powered by the A12 Bionic and A13 Bionic (both 7 nm) processors. The iPhone X doesn't support People Occlusion because it has the A11 chip (10 nm process).

    Why is that?

    That's because the People Occlusion feature is extremely computationally intensive. To turn it on, you just set a frame-semantics option that lets virtual content be occluded based on its depth:

    static var personSegmentationWithDepth: ARConfiguration.FrameSemantics { get }
    

    It's computationally intensive because ARKit composites the RGB, alpha, and Z-depth channels of the background, the 3D model, and the foreground in real time, at 60 fps tracking and 60 fps rendering. Only the A12 and A13 chipsets can do that without lag or overheating, since they are both more powerful and more energy-efficient.
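    Rather than letting the app crash on unsupported hardware, you can guard the feature at runtime with `supportsFrameSemantics(_:)`. A minimal sketch, assuming a standard ARKit setup where you run the session yourself:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// People Occlusion only works on A12/A13 devices; check support before
// enabling, otherwise running the session throws the
// NSInvalidArgumentException shown in the question.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
} else {
    print("People Occlusion is not supported on this device.")
}

// sceneView.session.run(configuration)
```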

    And the same reasoning applies to the Metal 2 framework:

    The GPU in the Apple A12 Bionic and A13 Bionic is the second generation of integrated graphics designed in-house by Apple rather than licensed from PowerVR. It can be found in the iPhone XS, iPhone XR and iPhone 11, includes 4 cores, and supports Metal 2.
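    If you want to gate other Metal work on the same hardware generation, you can query the GPU family directly; the A12 GPU corresponds to `MTLGPUFamily.apple5` (an iOS 13+ API). A hedged sketch:

```swift
import Metal

// MTLGPUFamily.apple5 maps to the A12 GPU generation and newer.
if let device = MTLCreateSystemDefaultDevice(),
   device.supportsFamily(.apple5) {
    print("A12-class GPU or newer — capable of People Occlusion.")
} else {
    print("Older GPU — People Occlusion unavailable.")
}
```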

    Also, you can read THIS POST for additional info.