swift · arkit · realitykit · reality-composer

How to preserve behaviors from Reality Composer?


I want to show a Reality Composer experience on an image, but I don't want the AR objects to disappear when tracking of the image is lost. So I created an AnchorEntity and fetched the translation of the ARImageAnchor in the session(_:didAdd:) delegate method as follows:

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    
    // If we already added the content to render, ignore
    if rootAnchor != nil {
        return
    }
    
    // Make sure we are adding to an image anchor. Assuming only
    // one image anchor in the scene for brevity.
    guard anchors[0] is ARImageAnchor else {
        return
    }
    
    // Create the entity to render, could load from your experience file here
    // this will render at the center of the matched image
    rootAnchor = AnchorEntity(world: [0,0,0])
    
    guard let boxAnchor = boxAnchor else { return }
    rootAnchor!.addChild(boxAnchor.tap!)
    
    arView.scene.addAnchor(rootAnchor!)
}
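
For reference, the boxAnchor above is loaded from the code Xcode generates for the Reality Composer file. With the default template names ("Experience" file containing a "Box" scene — your names may differ) the loading looks like:

```swift
// Load the scene from the generated Experience code. "Experience" and
// "Box" are the Xcode template defaults — substitute the names from
// your own Reality Composer file.
boxAnchor = try? Experience.loadBox()
```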

As you can see, I'm fetching the tap entity from the Reality Composer file in order to re-insert it into the anchor entity.

My problem now is that this entity has lost the behaviors from my Reality Composer file — it had behaviors such as flipping when tapped and reacting to proximity to the camera. By losing the behaviors, I'm losing the whole point of using Reality Composer.
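
(For context, notification triggers and actions are the one behavior type that the generated Experience code does expose programmatically. The trigger and action names below, "flip" and "flipEnded", are hypothetical — they would come from the Reality Composer scene:)

```swift
// Fire a notification trigger defined in Reality Composer.
// A trigger named "flip" is assumed here:
boxAnchor?.notifications.flip.post()

// React to a "Notify" action defined in Reality Composer.
// An action named "flipEnded" is assumed here:
boxAnchor?.actions.flipEnded.onAction = { entity in
    print("flip finished on:", entity?.name ?? "unnamed")
}
```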

Now my question is: how can I retrieve the behaviors from the .reality file? I thought about anchoring the whole experience instead of a single entity, so instead of rootAnchor!.addChild(boxAnchor.tap!) it would be rootAnchor!.addChild(boxAnchor), but then the session(_:didUpdate:) delegate method no longer updates the position of the entities in the scene. Here is the code:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let rootAnchor = rootAnchor else {
        return
    }
    
    // Code is assuming you only have one image anchor for brevity
    guard let imageAnchor = anchors[0] as? ARImageAnchor else {
        return
    }
    
    if !imageAnchor.isTracked {
        return
    }
    rootAnchor.transform = Transform(matrix: imageAnchor.transform)
}

Basically, rootAnchor.transform = Transform(matrix: imageAnchor.transform) has no effect when the root anchor contains the whole scene.

So how can I preserve the behaviors while updating the anchor to follow the image?

Thanks


Solution

  • The problem is that in RealityKit 2.0 we don't have access to Reality Composer's behaviors (except notifications). Also, the anchor type you assigned in Reality Composer – we know it's AnchorEntity(.image) – works the same way as ARKit's image anchor in ARImageTrackingConfiguration, i.e. it uses local tracking instead of world tracking. In other words, you can hardly achieve such a scenario with the current Reality Composer scene.

    Try ARKit and its ARWorldTrackingConfiguration for image tracking. All the "behaviors" must be recreated from scratch in RealityKit or in SceneKit.
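
    A minimal sketch of that approach, assuming the reference images live in an asset-catalog group named "AR Resources" and the "tap to flip" behavior is rebuilt by hand with a gesture recognizer (all names here are illustrative, not the original scene's):

```swift
import ARKit
import RealityKit
import UIKit

class ImageTrackedViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // World tracking with image detection: the detected image only
        // *places* the content; tracking then continues globally, so the
        // entities stay put when the image leaves the frame.
        let config = ARWorldTrackingConfiguration()
        if let images = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources",   // group name is an assumption
            bundle: nil) {
            config.detectionImages = images
            config.maximumNumberOfTrackedImages = 1
        }
        arView.session.run(config)

        arView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap)))
    }

    // A "tap to flip" behavior recreated from scratch in RealityKit,
    // standing in for the lost Reality Composer behavior. For entity(at:)
    // to hit-test an entity, it needs a CollisionComponent, e.g. via
    // modelEntity.generateCollisionShapes(recursive: true).
    @objc func handleTap(_ recognizer: UITapGestureRecognizer) {
        let point = recognizer.location(in: arView)
        guard let entity = arView.entity(at: point) else { return }
        var flipped = entity.transform
        flipped.rotation *= simd_quatf(angle: .pi, axis: [0, 0, 1])
        entity.move(to: flipped, relativeTo: entity.parent,
                    duration: 0.4, timingFunction: .easeInOut)
    }
}
```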