Tags: swift, augmented-reality, arkit, realitykit, arquicklook

Disable AR Object occlusion in QLPreviewController


I'm using QLPreviewController to show AR content. On newer iPhones with LiDAR, object occlusion seems to be enabled by default.

Is there any way to disable object occlusion in the QLPreviewController without having to build a custom ARKit view controller? Since my models are quite large (life-size buildings), they seem to disappear or get cut off at the end.
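
For reference, a minimal QLPreviewController setup of this kind looks roughly like the following sketch, where a bundled USDZ file is handed straight to the preview controller (the file and class names are placeholders):

    import UIKit
    import QuickLook

    class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

        // Hypothetical life-size building model bundled with the app
        let modelURL = Bundle.main.url(forResource: "Building", withExtension: "usdz")!

        func presentPreview() {
            let previewController = QLPreviewController()
            previewController.dataSource = self
            present(previewController, animated: true)
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            return modelURL as NSURL   // NSURL conforms to QLPreviewItem
        }
    }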


Solution

  • ARQuickLook is a framework built for quick, high-quality AR visualization. It adopts the RealityKit engine, so all of the features it supports, such as occlusion, anchors, ray-traced shadows, physics, depth of field, motion blur, HDR, etc., look the same way they do in RealityKit.

    However, you can't turn these features on or off through QuickLook's API; they are enabled by default whenever your iPhone supports them. If you want to toggle People Occlusion, you have to use the ARKit/RealityKit frameworks rather than QuickLook:

    import UIKit
    import RealityKit
    import ARKit

    class ViewController: UIViewController {
        
        @IBOutlet var arView: ARView!
        
        override func viewDidLoad() {
            super.viewDidLoad()
            // Load the Box scene from the Reality Composer "Experience" project
            let box = try! Experience.loadBox()
            arView.scene.anchors.append(box)
        }
        
        // Toggle People Occlusion on every tap
        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            self.switchOcclusion()
        }
        
        fileprivate func switchOcclusion() {
            
            // Grab a mutable copy of the session's current configuration
            guard let config = arView.session.configuration as? ARWorldTrackingConfiguration
            else { return }
            
            // People Occlusion requires an A12 chip or later
            guard ARWorldTrackingConfiguration.supportsFrameSemantics(
                .personSegmentationWithDepth
            ) else { return }
            
            // Add or remove the frame semantics, then re-run the session
            if config.frameSemantics.contains(.personSegmentationWithDepth) {
                config.frameSemantics.remove(.personSegmentationWithDepth)
            } else {
                config.frameSemantics.insert(.personSegmentationWithDepth)
            }
            arView.session.run(config)
        }
    }
    

    Pay particular attention that People Occlusion is supported only on devices with an A12 chip or later, and it requires iOS 13 or higher.
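
    The original question, though, concerns mesh-based object occlusion on LiDAR devices rather than People Occlusion. The same principle applies: it can only be controlled from your own ARView, not from QuickLook. Below is a minimal sketch, assuming a custom ARView and a hypothetical setObjectOcclusion(_:) helper:

    import RealityKit
    import ARKit

    extension ARView {
        
        // Hypothetical helper: toggles LiDAR-based object occlusion.
        // In a custom ARView this scene-understanding option is opt-in,
        // unlike in AR Quick Look, where it's on by default.
        func setObjectOcclusion(_ enabled: Bool) {
            
            // Mesh-based occlusion needs a device with a LiDAR scanner
            guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)
            else { return }
            
            if enabled {
                environment.sceneUnderstanding.options.insert(.occlusion)
            } else {
                environment.sceneUnderstanding.options.remove(.occlusion)
            }
        }
    }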


    P.S.

    The only customizable object QuickLook exposes is an instance of the ARQuickLookPreviewItem class.

    Use the ARQuickLookPreviewItem class when you want to control the background, designate which content the share sheet shares, or disable scaling when it isn't appropriate to let the user scale a particular model.
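
    A minimal sketch of how that looks in a QLPreviewControllerDataSource (the file name and web URL below are placeholders):

    import UIKit
    import QuickLook
    import ARKit

    class PreviewDataSource: NSObject, QLPreviewControllerDataSource {
        
        // Hypothetical USDZ file bundled with the app
        let modelURL = Bundle.main.url(forResource: "Building", withExtension: "usdz")!
        
        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }
        
        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            let item = ARQuickLookPreviewItem(fileAt: modelURL)
            item.allowsContentScaling = false   // keep a life-size model at 1:1 scale
            item.canonicalWebPageURL = URL(string: "https://example.com/building")   // shared by the share sheet
            return item
        }
    }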