Tags: ios, swift, arkit, avcapturesession, visionkit

How to use ARKit and vision kit without camera delay iOS Swift


I am using ARKit for face landmark positions and the Vision framework for hand pose detection. I can open the camera when the user taps the ARKit button and then switch over to the Vision-based capture session, but switching between the two cameras is slow.

    private var cameraFeedSession: AVCaptureSession?

    func setUpFaceDetection() {
        // Stop the Vision capture session before ARKit takes over the camera.
        DispatchQueue.main.async {
            self.cameraFeedSession?.stopRunning()
        }
        let configuration = ARFaceTrackingConfiguration()
        sceneView.isHidden = false
        sceneView.session.run(configuration)
    }



    func setUpHandPoseCamera() {
        self.index = 0
        alertLabel.text = "Please wait..."
        do {
            if cameraFeedSession == nil {
                cameraView.previewLayer.videoGravity = .resizeAspectFill
                try setupAVSession()
                cameraView.previewLayer.session = cameraFeedSession
            }
            // Add the output before starting the session, not after.
            if let session = cameraFeedSession, session.canAddOutput(cameraOutput) {
                session.addOutput(cameraOutput)
            }
            cameraFeedSession?.startRunning()
            cameraView.imageLayer.removeFromSuperlayer()
        } catch {
            AppError.display(error, inViewController: self)
        }
    }

func disableFaceDetection() {
    sceneView.isHidden = true
    sceneView.session.pause()
}

Solution

  • I'm pretty sure you can't. Under the hood, ARKit runs its own AVCaptureSession instance, which you have no direct access to, and two AVCaptureSessions cannot use the camera at the same time. There is also a hardware activation cost to stopping and starting an AVCaptureSession, which is the delay you are seeing.

    Is there any reason you can't use the video frames from the already-running ARKit session as the input to your Vision inference? You can disable tracking on the ARSession without interrupting it, effectively using it as a plain camera, by reading the ARFrame.capturedImage property in your ARSession delegate methods or from ARSession.currentFrame.
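
    A minimal sketch of that approach, assuming your ARKit view's session is already running and that you want hand poses from its frames. The `.right` orientation (device in portrait) and the delegate wiring are assumptions you may need to adjust for your app:

        import ARKit
        import Vision

        class ViewController: UIViewController, ARSessionDelegate {
            private let handPoseRequest = VNDetectHumanHandPoseRequest()

            func session(_ session: ARSession, didUpdate frame: ARFrame) {
                // ARFrame.capturedImage is the pixel buffer from the camera
                // ARKit is already driving — no second AVCaptureSession needed.
                let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                                    orientation: .right, // portrait; adjust for your UI
                                                    options: [:])
                do {
                    try handler.perform([handPoseRequest])
                    if let observation = handPoseRequest.results?.first {
                        let joints = try observation.recognizedPoints(.all)
                        // Use the recognized joint positions here.
                    }
                } catch {
                    // Vision failed on this frame; skip it.
                }
            }
        }

    Set `sceneView.session.delegate = self` and keep the single ARSession running for both face tracking and hand-pose inference, so there is no capture session to start or stop when the user switches modes.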