RealityKit is not using the macOS camera for rendering an object in ARView

I would like to develop a desktop app for macOS that uses the computer's camera with RealityKit to display a 3D cube in front of the camera at some distance. I started a macOS project using RealityKit and immediately noticed that it does not use the Mac's camera, and does not even ask for camera access permission, while the same project works fine on iOS. On macOS it shows the cube against a black background.

Is the same functionality not available on both macOS and iOS, or am I just doing something wrong?

We expect the app to ask for permission to access the camera, and the object to be visualized in front of the camera.


  • Unlike Unity AR apps, which can run on desktop computers, RealityKit apps for macOS don't include the ARKit part (Macs have no accelerometer/gyroscope/FaceID/LiDAR hardware sensors), so no tracking or scene-understanding capabilities are available in RealityKit's macOS API. RealityKit desktop apps are considered VR apps.

    Given the above, it is not difficult to guess that there is no ARSession object in RealityKit's macOS API.

    import SwiftUI
    import RealityKit
    struct ARContainer: NSViewRepresentable {
        let arView = ARView(frame: .zero)
        func makeNSView(context: Context) -> ARView {
            print(arView.session.identifier)    // Error on macOS: ARView has no 'session' member
            return arView
        }
        func updateNSView(_ nsView: ARView, context: Context) {}
    }
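
    What you can do on macOS is render the cube against a virtual camera instead of a camera feed. A minimal sketch of that VR-style setup is below; the name VRContainer and the sizes/positions are illustrative, not from the original project:

        import SwiftUI
        import RealityKit

        struct VRContainer: NSViewRepresentable {
            func makeNSView(context: Context) -> ARView {
                let arView = ARView(frame: .zero)

                // A cube anchored 2 m in front of the world origin.
                let cube = ModelEntity(mesh: .generateBox(size: 0.5),
                                       materials: [SimpleMaterial(color: .red, isMetallic: false)])
                let cubeAnchor = AnchorEntity(world: [0, 0, -2])
                cubeAnchor.addChild(cube)
                arView.scene.addAnchor(cubeAnchor)

                // A virtual camera at the origin stands in for the device camera feed.
                let camera = PerspectiveCamera()
                let cameraAnchor = AnchorEntity(world: .zero)
                cameraAnchor.addChild(camera)
                arView.scene.addAnchor(cameraAnchor)

                return arView
            }
            func updateNSView(_ nsView: ARView, context: Context) {}
        }

    With this approach the background stays black (or whatever environment you configure) because there is no pass-through camera image; the cube is simply rendered by the PerspectiveCamera.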