swift augmented-reality arkit realitykit reality-composer

Is this RealityKit stuff poorly conceived or am I missing something?


I am trying to play with Augmented Reality using RealityKit.

I want to have my program do one of the following things, selectable by the user:

  1. Detect horizontal surfaces only.
  2. Detect vertical surfaces only.
  3. Detect both horizontal and vertical surfaces.
  4. Detect images: for example, I print a target, attach it to an object in the real world, and the app detects it.

In order to do that, as far as I understand, I have to adjust three things:

ARWorldTrackingConfiguration

doing something like

func initSession() {

    let config = ARWorldTrackingConfiguration()
    config.planeDetection = .vertical    // or .horizontal

    arView.session.delegate = self
    arView.session.run(config)
}

Create scenes inside Experience.rcproject

One for each type of anchoring I need. I have created three "scenes" with the following anchor types: horizontal, vertical, and image.
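
Reality Composer generates a load method for each scene – the names below are placeholders for whatever the scenes are actually called:

let horizontalScene = try! Experience.loadHorizontalScene()
let verticalScene = try! Experience.loadVerticalScene()
let imageScene = try! Experience.loadImageScene()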

Create an ARCoachingOverlayView

To instruct the user, so that the detection works properly.
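
Something like this, assuming import ARKit and the same arView as above (the goal value is just a placeholder – it should change with the selected mode):

func addCoaching() {

    let coachingOverlay = ARCoachingOverlayView(frame: arView.bounds)
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    coachingOverlay.session = arView.session
    coachingOverlay.activatesAutomatically = true
    coachingOverlay.goal = .verticalPlane    // placeholder
    arView.addSubview(coachingOverlay)
}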

These are the problems:

  1. ARWorldTrackingConfiguration has only two options for planeDetection: horizontal or vertical.

  2. The scenes inside Experience.rcproject can only be of three kinds: horizontal, vertical, or image.

  3. The options for ARCoachingOverlayView.goal are: tracking (which is hard to figure out without proper documentation), horizontalPlane, verticalPlane, and anyPlane.

Questions:

  1. How do I configure ARWorldTrackingConfiguration and ARCoachingOverlayView.goal so the app can detect horizontal surfaces only, vertical surfaces only, both horizontal and vertical surfaces, or images, when they don't offer all four of these options?

  2. I have 3 scenes inside Experience.rcproject: one for horizontal, one for vertical, and one for image detection. Is that the right way to do it?


Solution

  • Let's assume that we've created three scenes in Reality Composer: BoxScene for horizontal plane detection (world tracking), StarScene for vertical plane detection (world tracking), and PrismScene for image detection (image tracking). In each scene we gave names to our models – automatic variables are generated from these names: goldenBox, plasticStar and paintedPrism.

    To switch from a World Tracking config to an Image Tracking config in RealityKit, we must use specific AnchorEntity initializers inside the buttons' @IBActions – .image and .plane.

    Look at the following code to find out how to do what you want.

    import RealityKit
    import UIKit
    
    class ViewController: UIViewController {
    
        @IBOutlet var arView: ARView!
        
        let cubeScene = try! Experience.loadBoxScene()
        let starScene = try! Experience.loadStarScene()
        let prismScene = try! Experience.loadPrismScene()
    
        // IMAGE TRACKING
        @IBAction func image(_ button: UIButton) {
            
            // Drop the anchors left over from the previous mode
            arView.scene.anchors.removeAll()
            
            // Anchor tied to a reference image named "image"
            // in the "AR Resources" group of the asset catalog
            let anchor = AnchorEntity(.image(group: "AR Resources",
                                              name: "image"))
            
            anchor.addChild(prismScene.paintedPrism!)
            arView.scene.anchors.append(anchor)
        }
    
        // WORLD TRACKING
        @IBAction func verticalAndHorizontal(_ button: UIButton) {
            
            arView.scene.anchors.removeAll()
            
            // One anchor per alignment: the cube goes onto the first
            // suitable horizontal plane...
            let anchor1 = AnchorEntity(.plane(.horizontal,
                              classification: .any,
                               minimumBounds: [0.1, 0.1]))
            anchor1.addChild(cubeScene.goldenBox!)
            arView.scene.anchors.append(anchor1)
            
            // ...and the star onto the first suitable vertical plane
            let anchor2 = AnchorEntity(.plane(.vertical,
                              classification: .any,
                               minimumBounds: [0.1, 0.1]))
            anchor2.addChild(starScene.plasticStar!)
            arView.scene.anchors.append(anchor2)
        }
    }
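
    As a side note on question 1: planeDetection in ARWorldTrackingConfiguration is an OptionSet, so you can pass [.horizontal, .vertical] to detect both plane types at once, and the coaching goal that matches it is anyPlane. Here's a rough sketch of wiring the configuration and the overlay together (a hypothetical helper, untested – same disclaimer as below):

    import ARKit
    
    func runSession(planes: ARWorldTrackingConfiguration.PlaneDetection,
                      goal: ARCoachingOverlayView.Goal) {
    
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = planes              // e.g. [.horizontal, .vertical]
        arView.session.run(config)
    
        let coachingOverlay = ARCoachingOverlayView(frame: arView.bounds)
        coachingOverlay.session = arView.session
        coachingOverlay.goal = goal                 // e.g. .anyPlane
        arView.addSubview(coachingOverlay)
    }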
    

    P. S.

    At the moment I have no computer with me – I've written this on an iPhone – so I don't know if there are any errors in this code...