I was looking at this tutorial (https://developer.apple.com/documentation/vision/recognizing_objects_in_live_capture), and the problem is that when you rotate the device, the camera preview's frame doesn't resize, so you end up with black space.
Any ideas to solve this problem?
In addition, I made my own version of the app in SwiftUI (where the problem persists), so solutions using SwiftUI are also appreciated!
Thanks in advance
I'm not sure how in SwiftUI, but here's how in UIKit:
Try setting the videoGravity:
    cameraView.videoPreviewLayer.videoGravity = .resizeAspectFill
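For context, `cameraView` here is assumed to be a `UIView` subclass whose backing layer is an `AVCaptureVideoPreviewLayer`, as in Apple's sample project. A minimal sketch of such a view:

```swift
import AVFoundation
import UIKit

// A view backed by AVCaptureVideoPreviewLayer, so the preview layer
// automatically tracks the view's bounds and participates in layout.
class PreviewView: UIView {
    // Make the view's backing layer an AVCaptureVideoPreviewLayer.
    override class var layerClass: AnyClass {
        AVCaptureVideoPreviewLayer.self
    }

    // Convenience accessor used as `cameraView.videoPreviewLayer` above.
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        layer as! AVCaptureVideoPreviewLayer
    }
}
```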
Then, the following should take care of the orientation. This is based on this answer.
    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        let cameraPreviewTransform = self.cameraView.transform
        coordinator.animate { (context) in
            let deltaTransform = coordinator.targetTransform
            let deltaAngle: CGFloat = atan2(deltaTransform.b, deltaTransform.a)
            var currentRotation = atan2(cameraPreviewTransform.b, cameraPreviewTransform.a)
            // Adding a small value to the rotation angle forces the animation to occur
            // in the desired direction, preventing an issue where the view would appear
            // to rotate 2π radians when going from landscape-right to landscape-left.
            currentRotation += -deltaAngle + 0.0001
            self.cameraView.layer.setValue(currentRotation, forKeyPath: "transform.rotation.z")
            self.cameraView.layer.frame = self.view.bounds
        } completion: { (context) in
            // Re-assert the final transform once the rotation animation completes.
            let currentTransform: CGAffineTransform = self.cameraView.transform
            self.cameraView.transform = currentTransform
        }
    }
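For the SwiftUI side, one option (a sketch, not tested against the sample project; `CameraViewController` is a hypothetical name for whatever UIKit view controller holds the code above) is to wrap that controller with `UIViewControllerRepresentable`, so the same rotation fix carries over:

```swift
import SwiftUI
import UIKit

// Hypothetical: CameraViewController is assumed to be the UIKit view
// controller that contains the viewWillTransition(to:with:) fix above.
struct CameraPreview: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> CameraViewController {
        CameraViewController()
    }

    func updateUIViewController(_ uiViewController: CameraViewController, context: Context) {
        // Nothing to update here; rotation handling lives in the controller.
    }
}
```

You can then use `CameraPreview()` like any other SwiftUI view, e.g. inside the app's body.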