Tags: swift, augmented-reality, arkit, realitykit, arcamera

RealityKit and ARKit – What is the AR project looking for when the app starts?


You will understand this question better if you open Xcode, create a new Augmented Reality Project and run that project.

After the project starts running on the device, you will see the feed from the rear camera showing your room.

After 3 or 4 seconds, a cube appears.

My questions are:

  1. what was the app doing before the cube appeared? I mean, I suppose the app was looking for tracking points in the scene so it could anchor the cube, right?

  2. if this is true, what elements is the app looking for?

  3. Suppose I am not satisfied with the point where the cube appeared. Is there any function I can trigger with a tap on the screen, so that tracking searches for new points near the location I tapped?

I know my question is generic, so please, just give me the right direction.


Solution

  • In the default Experience.rcproject, the cube has an AnchoringComponent that targets a horizontal plane. So the cube will not be displayed until the ARSession finds a horizontal plane in your scene (for example the floor or a table). Once it finds one, the cube appears (see the plane-anchoring sketch at the end of this answer).

    If you want instead to create an anchor and set it as the target when a tap event is received, you can perform a raycast. Using the result of the raycast, you can grab its worldTransform and set the cube's AnchoringComponent to that transform:

    Something like this:
    boxAnchor.anchoring = AnchoringComponent(.world(transform: raycastResult.worldTransform))
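
    For a fuller picture, here is a minimal sketch of that flow, assuming the default template's naming (a ViewController with an arView outlet and the box scene loaded via Experience.loadBox()). The tap handler raycasts from the touch point against detected horizontal planes and re-anchors the box at the hit location:

    import UIKit
    import RealityKit
    import ARKit

    class ViewController: UIViewController {

        @IBOutlet var arView: ARView!
        var boxAnchor: Experience.Box!   // scene generated from the default Experience.rcproject

        override func viewDidLoad() {
            super.viewDidLoad()

            // Load the default box scene and add it to the view, as the template does.
            boxAnchor = try! Experience.loadBox()
            arView.scene.anchors.append(boxAnchor)

            // Register a tap recognizer so the user can pick a new anchor point.
            let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
            arView.addGestureRecognizer(tap)
        }

        @objc func handleTap(_ sender: UITapGestureRecognizer) {
            let tapLocation = sender.location(in: arView)

            // Raycast from the touch point against existing horizontal plane geometry.
            guard let result = arView.raycast(from: tapLocation,
                                              allowing: .existingPlaneGeometry,
                                              alignment: .horizontal).first else {
                return
            }

            // Re-target the anchor to the world transform of the raycast hit.
            boxAnchor.anchoring = AnchoringComponent(.world(transform: result.worldTransform))
        }
    }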
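
    For completeness, the horizontal-plane anchoring the default scene relies on can also be built directly in code, without the .rcproject. This is only an illustrative sketch; planeAnchor and box are names made up here:

    import UIKit
    import RealityKit

    // Anchor that resolves once the session finds a horizontal plane of at least 20 x 20 cm.
    let planeAnchor = AnchorEntity(plane: .horizontal,
                                   classification: .any,
                                   minimumBounds: [0.2, 0.2])

    // A simple gray 10 cm box attached to that anchor; it is not rendered until the plane is found.
    let box = ModelEntity(mesh: .generateBox(size: 0.1),
                          materials: [SimpleMaterial(color: .gray, isMetallic: false)])
    planeAnchor.addChild(box)

    // arView is assumed to be your ARView instance.
    arView.scene.addAnchor(planeAnchor)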