I recently replaced a heavy OpenCV implementation with native ARKit for image detection and tracking. The images that I want to track are downloaded from a web service and stored locally. I'm creating the ARReferenceImage objects from them using the following code:
guard
    let image = UIImage(contentsOfFile: imageLocalPath),
    let cgImage = image.cgImage
else {
    return nil
}
return ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.12)
The physical width is small because the source images themselves are small, around 180 x 240 pixels each, but the printed copies may be bigger.
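For context, the referenceImages set used in the configuration below is built by collecting one of these objects per downloaded file. A minimal sketch, assuming a hypothetical loadedImagePaths array holding the locally stored file paths:

import ARKit
import UIKit

// Hypothetical helper: builds the set that is later assigned to the configuration.
func makeReferenceImages(from loadedImagePaths: [String]) -> Set<ARReferenceImage> {
    var references = Set<ARReferenceImage>()
    for path in loadedImagePaths {
        guard
            let image = UIImage(contentsOfFile: path),
            let cgImage = image.cgImage
        else { continue }
        let reference = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.12)
        // Naming the reference makes it easier to identify the matched anchor later.
        reference.name = (path as NSString).lastPathComponent
        references.insert(reference)
    }
    return references
}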
The session is configured depending on the current iOS version, since ARImageTrackingConfiguration is not available on iOS 11:
private lazy var configuration: ARConfiguration = {
    if #available(iOS 12.0, *),
       ARImageTrackingConfiguration.isSupported {
        return ARImageTrackingConfiguration()
    }
    return ARWorldTrackingConfiguration()
}()
if #available(iOS 12.0, *),
   let imagesTrackingConfig = configuration as? ARImageTrackingConfiguration {
    imagesTrackingConfig.trackingImages = referenceImages
} else if let worldTrackingConfig = configuration as? ARWorldTrackingConfiguration {
    worldTrackingConfig.detectionImages = referenceImages
}
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
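For completeness, detections are delivered as ARImageAnchor instances. A minimal sketch of consuming them via ARSCNViewDelegate, assuming a hypothetical ViewController that is set as the scene view's delegate:

import ARKit
import SceneKit

extension ViewController: ARSCNViewDelegate {
    // Called when ARKit adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        print("Detected image:", imageAnchor.referenceImage.name ?? "unnamed")
    }
}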
All of the code above works great on iOS 12 and 13, even when I use ARWorldTrackingConfiguration. Images are correctly detected by ARKit. But when I try to run it on iOS 11.3.1, the app immediately crashes with the following error:
Assert: /BuildRoot/Library/Caches/com.apple.xbs/Sources/AppleCV3D/AppleCV3D-1.13.11/library/VIO/OdometryEngine/src/FrameDownsampleNode/FrameDownsampler.cpp, 62: std::abs(static_cast(aspect_1) - static_cast(src_frame.image.width * output_frame_height_)) < max_slack (lldb)
Is it possible that creating markers dynamically at runtime is not available on iOS versions below 12.0, or am I doing something wrong? Unfortunately, I wasn't able to find any information regarding the specific versions. Any feedback is appreciated. Thank you.
After some time, I finally managed to find the source of the issue, so I'm posting it here since it may be useful for someone else with the same problem.
The answer is YES, it's possible to create ARReferenceImage objects programmatically and use them on iOS 11.3.1 for image detection.
My problem was in the source images themselves; it looks like an improper scale of the CGImage was causing the crash. The images downloaded from the API are 180 x 240. By checking the sizes of the CGImage instances used to create the ARReferenceImage objects, I realized that they weren't scaled properly: the CGImage size was the same as the source UIImage size in points, ignoring the screen scale.
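The mismatch is easy to spot by logging both sizes. A quick diagnostic sketch (not part of the original code):

// UIImage.size is in points, CGImage dimensions are in pixels;
// for a properly scaled image, pixels == points * scale.
if let cgImage = image.cgImage {
    print("UIImage size (points):", image.size, "scale:", image.scale)
    print("CGImage size (pixels):", cgImage.width, "x", cgImage.height)
}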
I decided to try re-drawing the images using the following:
func getRedrawnCgImage(_ originalImage: UIImage) -> CGImage? {
    guard let cgImage = originalImage.cgImage else { return nil }
    let size = CGSize(width: cgImage.width, height: cgImage.height)
    // A scale of 0.0 uses the main screen's scale factor, so the resulting
    // bitmap is size * screenScale pixels (e.g. 3x on a @3x device).
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    originalImage.draw(in: CGRect(origin: .zero, size: size))
    let redrawnImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return redrawnImage?.cgImage
}
The resulting CGImage has the size 540 x 720, i.e. 3x the original size, as it should be. By using these redrawn images, I got rid of the crash, but for now this is more of a workaround than a proper solution.
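Putting it together, the reference image creation from the question then goes through the redraw first. The same assumptions as in the original snippet apply (imageLocalPath is the local file path):

guard
    let image = UIImage(contentsOfFile: imageLocalPath),
    let redrawnCgImage = getRedrawnCgImage(image)
else {
    return nil
}
return ARReferenceImage(redrawnCgImage, orientation: .up, physicalWidth: 0.12)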