I'm working on an application that can recognize an image and then place an AR node (a 3D model built with Reality Composer) on top of that image. I want to build this with RealityKit/Reality Composer (which should also support image recognition), but it does not work.
I've already tested whether the model works on a simple horizontal plane, and it does (both in Xcode and in the Reality Composer test environment). But when I select an image as the anchoring mode, the model does not appear in the Xcode project, although it still appears in the Reality Composer test environment.
I currently use this code to load the Reality Composer scene in my Xcode project:
// Run a world-tracking session with horizontal plane detection.
let arConfiguration = ARWorldTrackingConfiguration()
arConfiguration.planeDetection = .horizontal
arView.session.run(arConfiguration)

// Load the scene generated from the Reality Composer project.
guard let anchor = try? Spinner.loadScene() else { return }
arView.scene.anchors.append(anchor)
The expected output would be that the model appears as soon as the camera is pointed at the correct image.
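For reference, here is the minimal version I would expect to work, letting the ARView configure and run its session automatically from the anchoring defined in the scene instead of running my own configuration (Spinner is the loader Xcode generates from my .rcproject; the arView outlet is assumed to be wired up in the storyboard):

import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // No manual session.run(_:) here: ARView configures and runs the
        // session automatically, based on the anchoring (image, in this
        // case) defined in the Reality Composer scene.
        guard let anchor = try? Spinner.loadScene() else { return }
        arView.scene.anchors.append(anchor)
    }
}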
I had the same issue on the iOS 13 beta. Updating to the iOS 13.1 beta did the trick; I can only guess it is something related to RealityKit on iOS. Please note that updating to the iOS 13.1 beta also requires Xcode 11 beta 7 to support it. Hope this helps.
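One more note in case it matters: if you run the session yourself, as in the question, the configuration you pass contains no detection images at all. I don't know whether that interacts with the Reality Composer image anchor, but at the plain ARKit level image detection has to be set up along these lines (the "AR Resources" group name is just an example of an AR Resource Group in the asset catalog):

import ARKit
import RealityKit

// Sketch of a manually run session that can detect images at the ARKit
// level. This is not specific to Reality Composer anchors.
func runManualImageDetection(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources",
        bundle: .main
    ) {
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1
    }
    arView.session.run(configuration)
}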