I'm trying to create a ruler application with ARKit and SceneKit. I've decided to generate the ruler image programmatically, depending on the measured distance. Here is the extension I use to draw the ruler:
```swift
import UIKit

extension UIImage {
    static let dashLineWidth: CGFloat = 2.0
    static let dashDistance: CGFloat = 163.0 / 25.4 // points per millimetre at 163 ppi
    static let rulerFont: UIFont = .systemFont(ofSize: 15.0, weight: .regular)
    static let attributes: [NSAttributedStringKey: Any] = [
        NSAttributedStringKey.font: rulerFont,
        NSAttributedStringKey.foregroundColor: UIColor.black
    ]

    static func drawRuler(width: CGFloat) -> UIImage? {
        let cm = width * 100 // width in centimeters
        let size = CGSize(width: dashDistance * cm * 10, height: 50.0)
        UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }

        // White background
        let background = UIBezierPath(rect: CGRect(origin: .zero, size: size))
        context.addPath(background.cgPath)
        context.setFillColor(UIColor.white.cgColor)
        context.fillPath()

        // Dashes: long every centimetre, medium every 5 mm, short every mm
        var i: CGFloat = 0.0
        var counter: Int = 0
        while i < size.width {
            let isLongDash = counter % 10 == 0
            let isPartDash = counter % 5 == 0
            let dashHeight: CGFloat = size.height * (isLongDash ? 0.25 : isPartDash ? 0.15 : 0.07)
            UIColor.black.setFill()
            UIRectFill(CGRect(x: i - dashLineWidth / 2, y: 0.0, width: dashLineWidth, height: dashHeight))
            if isLongDash && counter != 0 {
                let value = "\(counter / 10)"
                let valueSize = value.size(withAttributes: attributes)
                value.draw(at: CGPoint(x: i - dashLineWidth / 2 - valueSize.width / 2, y: dashHeight + 5.0), withAttributes: attributes)
            }
            i += dashDistance
            counter += 1
        }

        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }

    func crop(to width: CGFloat, initialWidth: CGFloat) -> UIImage? {
        let rect = CGRect(x: 0, y: 0, width: (width / initialWidth) * size.width * scale, height: size.height * scale)
        guard let croppedCGImage = cgImage?.cropping(to: rect) else { return nil }
        return UIImage(cgImage: croppedCGImage)
    }
}
```
So at first I draw a 0.5 meter image only once for better performance, and then each time just crop the needed part to display on the SCNNode. And here is what I'm doing in my SCNNode class:
```swift
var ruler: SCNNode = initRuler()
var initialWidth: CGFloat = 0.5
var rulerImage: UIImage? = UIImage.drawRuler(width: initialWidth)

func updateRuler() {
    guard let geometry = ruler.geometry as? SCNBox else {
        fatalError("Geometry is not SCNBox")
    }
    let width = geometry.width // in meters
    // Redraw a bigger base image once the node outgrows the current one
    if width > initialWidth - 0.05 {
        initialWidth += 0.5
        rulerImage = UIImage.drawRuler(width: initialWidth)
    }
    guard let croppedImage = rulerImage?.crop(to: width, initialWidth: initialWidth) else { return }
    let texture = SKTexture(image: croppedImage)
    let material = SCNMaterial()
    material.diffuse.contents = texture
    geometry.materials = [material]
}
```
Everything works fine until the SCNNode gets bigger and the image grows with it. At around 1.3 meters I get a crash:

```
validateTextureDimensions:759: failed assertion `MTLTextureDescriptor has width (16501) greater than the maximum allowed size of 16384.'
```

Any help would be appreciated. I was thinking I could split the image into parts and then assign them to the material. Or is there another way to do this?
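For context on where a failure near 1.3 m comes from, here is a rough back-of-the-envelope check (my arithmetic, assuming a 2x screen scale; passing 0.0 as the scale to UIGraphicsBeginImageContextWithOptions renders at the device's screen scale):

```swift
// Rough check of where the 16384 px texture limit is hit (assumes a 2x device).
let dashDistance = 163.0 / 25.4                  // points per millimetre
let screenScale = 2.0                            // assumed device scale factor
let pixelsPerMeter = dashDistance * 1000.0 * screenScale
let limitInMeters = 16384.0 / pixelsPerMeter     // metres of ruler per max-size texture
print(pixelsPerMeter, limitInMeters)             // ≈ 12834.6 px per metre, limit near 1.28 m
```

So each metre of ruler costs roughly 12800 px at 2x, and the limit lands just under 1.3 m, which matches the crash.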
Instead of assigning a single 16501px-wide image to one SCNNode representing the whole ruler, it makes much more sense to build your ruler out of hundreds of 1cm segments, each with a texture that you draw programmatically for that ruler segment with its number.
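A minimal sketch of that approach (my own code, not the asker's; the segment texture size, dash layout, and box dimensions are assumptions):

```swift
import SceneKit
import UIKit

// Draw ONE centimetre of ruler into a small texture: a long dash at the
// leading edge plus the centimetre number. 64x32 pt is an assumed size.
func drawSegmentImage(index: Int) -> UIImage? {
    let size = CGSize(width: 64, height: 32)
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    defer { UIGraphicsEndImageContext() }
    UIColor.white.setFill()
    UIRectFill(CGRect(origin: .zero, size: size))
    UIColor.black.setFill()
    UIRectFill(CGRect(x: 0, y: 0, width: 2, height: size.height * 0.5))
    "\(index)".draw(at: CGPoint(x: 4, y: size.height * 0.55),
                    withAttributes: [.font: UIFont.systemFont(ofSize: 10)])
    return UIGraphicsGetImageFromCurrentImageContext()
}

// Build the ruler from 1 cm segment nodes laid end to end along the x axis,
// so no single diffuse texture ever approaches Metal's 16384 px limit.
func makeRulerNode(lengthInCm: Int) -> SCNNode {
    let ruler = SCNNode()
    let segmentLength: CGFloat = 0.01 // 1 cm in scene units (metres)
    for cm in 0..<lengthInCm {
        let box = SCNBox(width: segmentLength, height: 0.005,
                         length: 0.001, chamferRadius: 0)
        let material = SCNMaterial()
        material.diffuse.contents = drawSegmentImage(index: cm)
        box.materials = [material]
        let node = SCNNode(geometry: box)
        // Centre each segment at its position along the ruler.
        node.position = SCNVector3(Float(cm) * 0.01 + 0.005, 0, 0)
        ruler.addChildNode(node)
    }
    return ruler
}
```

Lengthening the ruler then just means appending more segment nodes (and removing them to shrink), instead of redrawing and re-cropping one enormous image.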