Trying to use my RealityKit project as the foundation for an on-screen (VR) app instead of projecting onto the real world (AR) through the back camera.
Does anyone know how to load a RealityKit scene asynchronously with the .nonAR
camera option, so it renders inside the app instead of using the rear-facing camera?
Also, do I create the position information in the Swift code or in the Reality Composer project?
Here's how you can asynchronously load a .usdz
VR model with the help of RealityKit's .loadModelAsync()
type method and Combine's AnyCancellable
type.
import UIKit
import RealityKit
import Combine

class VRViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var anyCancellable: AnyCancellable? = nil

    // World anchor placed 2 meters in front of the virtual camera
    let anchorEntity = AnchorEntity(world: [0, 0, -2])

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        arView.backgroundColor = .black

        // Render the scene with a virtual camera, not the device camera feed
        arView.cameraMode = .nonAR

        anyCancellable = ModelEntity.loadModelAsync(named: "horse").sink(
            receiveCompletion: { _ in
                // Release the subscription once loading finishes
                self.anyCancellable?.cancel()
            },
            receiveValue: { [self] (object: Entity) in
                if let model = object as? ModelEntity {
                    self.anchorEntity.addChild(model)
                    self.arView.scene.anchors.append(self.anchorEntity)
                } else {
                    print("Can't load the model")
                }
            }
        )
    }
}
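As for the position question: you can set transforms either in Reality Composer or directly in Swift. A minimal sketch of doing it in code might look like this (the offsets, angle, and scale are illustrative values, not from the original project):

```swift
import RealityKit

// A sketch of positioning an already-loaded model purely in Swift code.
func place(_ model: ModelEntity, under anchor: AnchorEntity) {
    // Offset the model half a meter to the right of its anchor
    model.position = SIMD3<Float>(0.5, 0, 0)
    // Rotate it 45 degrees around the Y axis
    model.orientation = simd_quatf(angle: .pi / 4, axis: [0, 1, 0])
    // Uniformly scale it down to half size
    model.scale = SIMD3<Float>(repeating: 0.5)
    anchor.addChild(model)
}
```

Transforms set in Reality Composer are baked into the loaded entity hierarchy, so code-side transforms like these are applied on top of them.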
However, if you want to move around inside the 3D environment using the device's tracked position, keep the default .ar camera mode and, instead of switching to .nonAR, hide the camera feed behind a solid color:

arView.environment.background = .color(.black)