I know that the transform of a 3D model provides scale, rotation, and translation info, but is there a way to calculate the size of a 3D model in visionOS?
In RealityKit, there are two ways to find out what a model's size is. The first way can be found here: you query the model's visual bounding box directly (a minimal sketch of it is shown below). The second one is more convenient, since you simply tap on the model and the raycast(...) method returns the required values. For the raycast you can use system gestures (my code takes this approach) or ARKit's hand tracking.
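For reference, here's a minimal sketch of the first approach, assuming an already-created entity (the box dimensions are just placeholders):

import RealityKit

// First way: query the visual bounding box directly.
// visualBounds(relativeTo: nil) returns world-space bounds,
// so its extents already include the entity's scale.
let model = ModelEntity(mesh: .generateBox(width: 1.0,
                                          height: 3.5,
                                           depth: 1.0))
let extents = model.visualBounds(relativeTo: nil).extents
print("size = \(extents.x)m x \(extents.y)m x \(extents.z)m")

And here's the tap-and-raycast version: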
import SwiftUI
import RealityKit

struct ContentView : View {
    @State var sizeOfModel: String = ""
    var material = UnlitMaterial()
    let building = ModelEntity(mesh: .generateBox(width: 1.0,    // 1.0 m
                                                 height: 3.5,    // 3.5 m
                                                  depth: 1.0))   // 1.0 m
    init() {
        material.color.texture = .init(try! .load(named: "texture"))
        building.model?.materials = [material]
    }
    var body: some View {
        RealityView { realityViewContent in
            // Make the model tappable and give it collision shapes,
            // otherwise neither the gesture nor the raycast can hit it
            building.components.set(InputTargetComponent())
            building.generateCollisionShapes(recursive: false)

            // Lift the box so its base sits at y = 0, then push it 5 m away
            let offset = building.visualBounds(relativeTo: nil).extents.y / 2
            building.position.y += offset
            building.position.z = -5.0
            realityViewContent.add(building)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded {
                    // let location = $0.location
                    guard let scene = $0.entity.scene,
                          let castHit = scene.raycast(from: .zero,
                                                        to: [0, 0, -10]).first
                    else { return }

                    // extents is the full size of the bounding box, in meters
                    let size = castHit.entity.visualBounds(relativeTo: nil)
                    let x = size.extents.x
                    let y = size.extents.y
                    let z = size.extents.z
                    sizeOfModel = "Building's size is \(x)m x \(y)m x \(z)m"
                    print(sizeOfModel)

                    // CollisionCastHit can also measure a distance for you
                    print("distance =", castHit.distance, "m")
                }
        )
    }
}
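A couple of notes. The tapped entity is also available directly as $0.entity inside the gesture closure, so you could read its visualBounds without raycasting at all; what the raycast adds is the hit distance (castHit.distance). And since sizeOfModel is already a @State property, you can show the measurement on screen instead of just printing it. A minimal sketch, with the RealityView setup and gesture unchanged from above:

var body: some View {
    RealityView { realityViewContent in
        // ...same setup as above...
    }
    .overlay(alignment: .bottom) {
        Text(sizeOfModel)
            .font(.title2)
            .padding()
    }
    .gesture( /* same SpatialTapGesture as above */ )
}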