I am trying to load a 3D model, "cube.usdz", at the camera's current world position.
I've already looked at this solution: https://stackoverflow.com/a/45089244/13246089, but I have no idea how to implement it, or where the function "session" even gets called from. Sorry, I am very new to this and don't know what I am doing.
Why does my code always hit the "why does nothing work" branch? Why can't I get the position of my camera? Any assistance or suggestions would be greatly appreciated.
import UIKit
import RealityKit
import ARKit

class Load3DModelViewController: UIViewController, ARSessionDelegate {

    @IBOutlet weak var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // set up AR configuration
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        arView.session.run(configuration)
        load3DModel()
        arView.session.delegate = self
    }

    func load3DModel() {
        let modelFileName = "cube.usdz"
        guard let modelEntity = try? Entity.load(named: modelFileName) else {
            print("Failed to load the 3D model")
            return
        }
        if let pos = arView.session.currentFrame?.camera.transform {
            let xpos = pos.columns.3.x
            let ypos = pos.columns.3.y
            let zpos = pos.columns.3.z
            let modelTranslation = SIMD3<Float>(xpos, ypos, zpos - 1)
            modelEntity.setPosition(modelTranslation, relativeTo: nil)
            arView.scene.addAnchor(modelEntity as! HasAnchoring)
        } else {
            print("\nwhy does nothing work\n")
        }
    }
}
The first scenario shows how RealityKit automatically tracks the anchor's target (here, the ARCamera), so the sphere always follows the camera, with a 1-meter offset along the Z axis.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // A sphere 5 cm in radius, placed 1 m in front of the camera
        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))
        sphere.position.z = -1.0

        // AnchorEntity(.camera) keeps its children pinned to the ARCamera
        let cameraAnchor = AnchorEntity(.camera)
        cameraAnchor.addChild(sphere)
        arView.scene.addAnchor(cameraAnchor)
    }
}
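If you want the same behavior with your own model instead of a generated sphere, a minimal sketch might look like this (assuming "cube.usdz" is bundled with the app; Entity.load(named:) can throw, so the failure case is handled):

import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Assumes "cube.usdz" is included in the app bundle; adjust the name if needed
        guard let cube = try? Entity.load(named: "cube.usdz") else {
            print("Failed to load the 3D model")
            return
        }
        // Place the model 1 m in front of the camera and keep it pinned to the camera
        cube.position.z = -1.0
        let cameraAnchor = AnchorEntity(.camera)
        cameraAnchor.addChild(cube)
        arView.scene.addAnchor(cameraAnchor)
    }
}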
The second scenario shows how to create a sphere that is tethered to a world anchor whose transform is taken from the camera's location two seconds after the session starts.
Take a step back to see the sphere.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))

        // Wait 2 seconds so the session has a chance to start tracking,
        // then freeze a world anchor at the camera's current transform
        DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
            let worldAnchor = AnchorEntity()
            worldAnchor.transform = self.arView.cameraTransform
            worldAnchor.addChild(sphere)
            self.arView.scene.addAnchor(worldAnchor)
        }
    }
}
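As for the session(_:didUpdate:) method from the linked answer: you don't call it yourself. ARKit calls it on the session's delegate every frame once you set arView.session.delegate = self, and frame.camera.transform is valid inside it. Here is a minimal sketch of that approach, combining it with the world-anchor idea above; the class name and the modelPlaced flag are just illustrative:

import UIKit
import RealityKit
import ARKit

class DelegateViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!
    private var modelPlaced = false   // illustrative flag so the model is placed only once

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.session.delegate = self          // ARKit will now call session(_:didUpdate:)
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        arView.session.run(config)
    }

    // Called by ARKit on every new frame; the camera transform is available here
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard !modelPlaced, let model = try? Entity.load(named: "cube.usdz") else { return }
        modelPlaced = true

        // Freeze a world anchor at the camera's current transform
        let anchor = AnchorEntity()
        anchor.transform = Transform(matrix: frame.camera.transform)
        // Offset the model 1 m in front of the anchor (along its -Z axis)
        model.position.z = -1.0
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    }
}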
You can find more info here.