swift, swiftui, augmented-reality, arkit, realitykit

What is the real benefit of using Raycast in ARKit and RealityKit?


What is raycasting in RealityKit and ARKit for?

And when do I need to use the makeRaycastQuery instance method?

func makeRaycastQuery(from point: CGPoint, 
                 allowing target: ARRaycastQuery.Target, 
                       alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?

Any help appreciated.


Solution

  • Theory

    Simple raycasting, like hit-testing, helps you place a virtual 3D point on a real-world surface by projecting an imaginary ray from a 2D screen point onto a detected plane. Apple's official documentation (2019) gave the following definition of raycasting:

    Raycasting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With a tracked raycast, ARKit and RealityKit continue to refine the results to increase the positional accuracy of the virtual content you placed with a raycast.

    Let's look at the difference between the hitTest and raycast technologies and figure out why Apple decided to replace hit-testing with raycasting.

    In ARKit, performing a hitTest requires that a single ARFrame be captured first in order to obtain a collection of ARHitTestResult objects. An ARFrame contains anchors with their transforms, feature points, the world origin, and so on (see the short ARFrame sketch after the hitTest snippet below). The main advantage of AR hitTest over AR raycast is that the former provides the distance from the camera to the real-world surface. But hitTest has two disadvantages compared to raycast – poorer performance and lower accuracy.

    import ARKit
    import RealityKit
    
    let arView = ARView(frame: .zero)
    let center = arView.center
    

    // [ARHitTestResult]
    let hitTest = arView.hitTest(center,  
                                 types: [.featurePoint, .existingPlane]).first
    
    hitTest?.anchor              // optional ARAnchor
    hitTest?.distance            // distance to surface
    hitTest?.type                // result type
    hitTest?.localTransform      // relative transform
    hitTest?.worldTransform      // absolute transform
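
    To make the ARFrame claim above concrete, here's a minimal sketch of what a captured frame exposes. It isn't part of the original answer; the property names come from ARKit's ARFrame API, and currentFrame is only non-nil while the session is running:

    // A captured ARFrame supplies the context hitTest relies on:
    // tracked anchors, a sparse feature-point cloud, and the camera pose
    // expressed relative to the world origin.
    if let frame = arView.session.currentFrame {
        frame.anchors                     // [ARAnchor] currently tracked
        frame.rawFeaturePoints?.points    // optional sparse point cloud
        frame.camera.transform            // camera pose in world space
    }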
    

    A simple raycast, in its turn, checks for intersections between the cast ray and real-world surfaces just once, and its result – an array of ARRaycastResult objects – is returned immediately. A simple AR raycast is the "little brother" of ARTrackedRaycast, a raycast query that ARKit repeats in succession to give you refined results over time (a sketch of a tracked raycast follows the snippet below). As you'd expect, tracked raycasting is the more computationally intensive technology.

    // [ARRaycastResult]
    let raycast = arView.raycast(from: center,
                             allowing: .estimatedPlane,
                            alignment: .horizontal).first
        
    raycast?.anchor              // optional ARAnchor
    raycast?.target              // target allowing
    raycast?.targetAlignment     // alignment
    raycast?.worldTransform      // absolute transform
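
    Since ARTrackedRaycast is only mentioned in passing above, here is a minimal sketch of how a tracked raycast might look. The ARSession.trackedRaycast(_:updateHandler:) call is the real API; keeping the returned object alive and stopping it yourself are the key points, while the variable names are just illustrative:

    // ARKit repeats a tracked raycast and calls the handler with refined
    // results, so keep a strong reference to the returned ARTrackedRaycast.
    var trackedRaycast: ARTrackedRaycast?

    if let query = arView.makeRaycastQuery(from: arView.center,
                                       allowing: .estimatedPlane,
                                      alignment: .horizontal) {
        trackedRaycast = arView.session.trackedRaycast(query) { results in
            guard let result = results.first else { return }
            // Reposition your anchor or entity with the refined transform.
            print(result.worldTransform)
        }
    }

    // Stop refinement when the placement is final.
    // trackedRaycast?.stopTracking()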
    


    SwiftUI implementation

    This code shows you how to implement a simple raycast on tap in a SwiftUI app:

    import SwiftUI
    import RealityKit
    import ARKit
    
    struct ContentView : View {
        let arView = ARView(frame: .zero)
        var model = try! Entity.loadModel(named: "robot")
        
        var body: some View {            
            ARContainer(arView: arView)
                .ignoresSafeArea()
                .onTapGesture { 
                    self.raycasting() 
                }
        }
        
        fileprivate func raycasting() {
            // Build a raycast query from the screen's center point,
            // targeting estimated horizontal planes.
            guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                 alignment: .horizontal)
            else { return }
    
            // Run the query once and take the nearest result.
            guard let result = arView.session.raycast(query).first
            else { return }
    
            // Anchor the model at the hit's world transform. An entity has a
            // single parent, so repeated taps re-parent the same model –
            // clone it if you want to place multiple copies.
            let raycastAnchor = AnchorEntity(world: result.worldTransform)
            raycastAnchor.addChild(model)
            arView.scene.anchors.append(raycastAnchor)
        }
    }
    

    struct ARContainer : UIViewRepresentable {        
        let arView: ARView
        
        func makeUIView(context: Context) -> ARView { return arView }
        func updateUIView(_ view: ARView, context: Context) { }
    }
    

    UIKit implementation

    This code shows you how to implement a simple raycast on tap in a UIKit app:

    import UIKit
    import RealityKit
    import ARKit
    
    class ViewController : UIViewController {   
        @IBOutlet var arView: ARView!
        let model = try! Entity.loadModel(named: "usdzModel")
        
        override func touchesBegan(_ touches: Set<UITouch>, 
                                  with event: UIEvent?) {
            self.raycasting()
        }
    
        fileprivate func raycasting() {             
            guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                 alignment: .horizontal)
            else { return }
    
            guard let result = arView.session.raycast(query).first
            else { return }
    
            let raycastAnchor = AnchorEntity(world: result.worldTransform)
            raycastAnchor.addChild(model)
            arView.scene.anchors.append(raycastAnchor)
        }
    }
    

    Read this post to find out how to implement convex raycasting in RealityKit.
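
    As a related aside (not the convex-cast technique from that post), RealityKit also offers its own scene-space raycast against entities' collision shapes via Scene.raycast(origin:direction:length:query:mask:relativeTo:). A minimal sketch, assuming the entities you want to hit already have collision components:

    // Unlike ARRaycastQuery, which targets real-world surfaces detected by
    // ARKit, this ray only hits entities that carry a CollisionComponent.
    let hits = arView.scene.raycast(origin: [0, 0, 0],
                                 direction: [0, 0, -1],
                                    length: 5,
                                     query: .nearest)
    if let hit = hits.first {
        hit.entity       // the entity that was hit
        hit.position     // world-space hit position
        hit.distance     // distance from the ray origin
    }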