Tags: ios, swiftui, arkit, realitykit

Is it possible to access the mesh created by ARKit?


I am learning AR using SwiftUI and RealityKit, and I started with a very basic implementation that for now shows only the camera stream.
Later, virtual objects should be placed using a ModelEntity and an AnchorEntity, as sketched below.
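For illustration, that later step might look roughly like this (a sketch only, assuming a simple generated box anchored to a horizontal plane; arView is the ARView instance):

// Sketch: attach a 10 cm box to the first detected horizontal plane.
let box = ModelEntity(mesh: .generateBox(size: 0.1))
let planeAnchor = AnchorEntity(plane: .horizontal)
planeAnchor.addChild(box)
arView.scene.addAnchor(planeAnchor)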

App:

import SwiftUI

@main
struct BasicARApp: App {
    var body: some Scene {
        WindowGroup {
            MainView()
        }
    }
}

MainView:

import SwiftUI
import RealityKit

struct MainView : View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        return arView
    }
    
    func updateUIView(_ uiView: ARView, context: Context) {}
    
}

The app that I have in mind requires identifying the object closest to the camera.
To find this object in the mesh generated by ARKit, Apple's docs suggest raycasting from a screen location through the mesh, e.g. using

if let result = arView.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any).first { // … }  
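In fuller form, that raycast could be wired to a tap gesture, roughly like the following sketch (the handleTap name, the gesture registration, and the arView reference are assumptions; the hit is marked with a small sphere):

// Sketch: place a marker sphere where a tap's raycast hits the scene.
@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    let tapLocation = recognizer.location(in: arView)
    if let result = arView.raycast(from: tapLocation,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first {
        let marker = ModelEntity(mesh: .generateSphere(radius: 0.02))
        let anchor = AnchorEntity(world: result.worldTransform)
        anchor.addChild(marker)
        arView.scene.addAnchor(anchor)
    }
}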

However, I think it would be easier if I had direct access to the mesh and could filter out the vertices closest to the camera.

So my question is: How do I get access to the mesh generated by ARKit?


Solution

  • Here is my code update, partly based on Adarsh Sharma's answer (+1) and Apple's demo project Visualizing and Interacting with a Reconstructed Scene.
    Essentially, I had to configure the session property of the ARView so that it creates the mesh.
    Then I can set a breakpoint at let meshGeometry = meshAnchor.geometry and handle the mesh anchor there.
    PS: This code works on an iPhone 14 Pro under iOS 16.6, even though I thought this device didn't have a LiDAR scanner.
    EDIT: I have since realized that my iPhone 14 Pro does have a LiDAR scanner.

    import SwiftUI
    import RealityKit
    import ARKit

    struct ARViewContainer: UIViewRepresentable {
        
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            
            // Configure the ARView to generate and visualize the mesh
            arView.environment.sceneUnderstanding.options = []
            arView.debugOptions.insert(.showSceneUnderstanding)
            arView.renderOptions = [.disablePersonOcclusion, .disableDepthOfField, .disableMotionBlur]
            
            // Manually configure what kind of AR session to run, since ARView
            // on its own does not turn on scene reconstruction.
            arView.automaticallyConfigureSession = false
            let configuration = ARWorldTrackingConfiguration()
            // Scene reconstruction is only available on LiDAR-equipped devices.
            if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
                configuration.sceneReconstruction = .mesh
            }
            configuration.environmentTexturing = .automatic
            arView.session.run(configuration)
    
            arView.session.delegate = context.coordinator
            
            return arView
        }
        
        func updateUIView(_ uiView: ARView, context: Context) {}
        
        func makeCoordinator() -> Coordinator {
            Coordinator(self)
        }
        
        class Coordinator: NSObject, ARSessionDelegate {
            var parent: ARViewContainer
            
            init(_ parent: ARViewContainer) {
                self.parent = parent
            }
            
            // Called whenever ARKit updates its anchors, including the
            // ARMeshAnchors produced by scene reconstruction.
            func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
                for anchor in anchors {
                    if let meshAnchor = anchor as? ARMeshAnchor {
                        // Set a breakpoint here to inspect the mesh geometry.
                        let meshGeometry = meshAnchor.geometry
                        _ = meshGeometry
                    }
                }
            }
        }
        
    }
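    To answer the original question of finding what is nearest to the camera: with the delegate in place, each ARMeshAnchor's vertex buffer can be read directly.
    Below is a minimal sketch of finding the vertex closest to the camera; closestVertex(to:) is a hypothetical helper, and it assumes the common layout of three packed Floats per vertex (check geometry.vertices.format before relying on this).

    import ARKit
    import simd

    extension ARMeshAnchor {
        // Returns the world-space position of this anchor's vertex that is
        // closest to the given camera position (also in world space).
        func closestVertex(to cameraPosition: SIMD3<Float>) -> SIMD3<Float>? {
            let vertices = geometry.vertices          // ARGeometrySource
            let pointer = vertices.buffer.contents()  // raw vertex buffer
            var closest: SIMD3<Float>?
            var closestDistance = Float.greatestFiniteMagnitude
            for index in 0..<vertices.count {
                // Assumption: each entry is three packed Floats (x, y, z).
                let byteOffset = vertices.offset + index * vertices.stride
                let vertex = pointer.advanced(by: byteOffset)
                    .assumingMemoryBound(to: (Float, Float, Float).self).pointee
                // Vertices are in the anchor's local space; move them to world space.
                let world = transform * SIMD4<Float>(vertex.0, vertex.1, vertex.2, 1)
                let position = SIMD3<Float>(world.x, world.y, world.z)
                let distance = simd_distance(position, cameraPosition)
                if distance < closestDistance {
                    closestDistance = distance
                    closest = position
                }
            }
            return closest
        }
    }

    Inside session(_:didUpdate:), the camera position can be taken from the last column of session.currentFrame?.camera.transform; the anchor whose closest vertex has the smallest distance is then the candidate for the object nearest the camera.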