ios · swift · augmented-reality · arkit

How can I use two or three AR session configurations in parallel, i.e. ARWorldTracking and ARBodyTracking?


(Note: my scanning method is to keep the scanning device stable and let the target body move in a circular manner so the complete body gets scanned.)

I want to scan a human in 3D. I need depth data, which I get from the ARWorldTracking configuration, to generate a point cloud (PCD), and at the same time I want the human body's movements to be tracked with ARBodyTracking so that I can align the point clouds as the body rotates and stitch them together.
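For reference, this is roughly the depth setup I mean (a minimal sketch; it assumes a LiDAR-equipped device, and the actual point-cloud generation is only described in a comment):

```swift
import ARKit

let session = ARSession()
let config = ARWorldTrackingConfiguration()

// Ask for a per-frame depth map (requires a LiDAR-equipped device).
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    config.frameSemantics.insert(.sceneDepth)
}
session.run(config)

// In session(_:didUpdate:), each ARFrame then carries frame.sceneDepth?.depthMap,
// which I back-project into 3D points to build the PCD.
```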

Please suggest an approach for this task.


Solution

  • So, what you want is to be able to scan something (a human here) in different poses to build a point cloud from angles that you could not normally reach easily?

    Sadly, you can't have several AR sessions running in parallel. But switching configurations is very fast. My guess is that you would have to separate those steps into a free-move body-tracking phase and a no-move scanning phase.
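    To illustrate the switch, here is a minimal sketch of re-running a single ARSession with a different configuration (the helper function names are mine, not ARKit API):

```swift
import ARKit

// Body-tracking phase: the model is free to move and find a pose.
func startBodyTracking(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}

// Scanning phase: grab per-frame depth while the model holds still.
func startDepthScanning(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    // No .resetTracking / .removeExistingAnchors options, so existing anchors
    // are kept when the session is re-run with the new configuration.
    session.run(config, options: [])
}
```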

    Scenario Idea:

    -> Start the tracking process.

    -> Your model can move freely and decide on a pose to adopt.

    -> Once the body position has been acquired by tracking, start a scan session. The model must not move.

    -> Once you're happy with the data from this pose, stop the scan session and switch the app back to tracking.

    -> Either start a new scan after finding a new pose you like, or stop the whole process (see the sketch after this list).
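    A rough sketch of that loop as a single coordinator (ScanPhase, ScanCoordinator, capturedClouds and the other names are hypothetical, not ARKit API; the depth-to-point-cloud and skeleton handling are left as comments):

```swift
import ARKit
import simd

// Hypothetical coordinator alternating between the two phases described above.
enum ScanPhase { case bodyTracking, depthScanning }

final class ScanCoordinator: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var phase: ScanPhase = .bodyTracking
    // Points are stored per pose so a bad pose can be dropped later.
    private(set) var capturedClouds: [Int: [simd_float3]] = [:]
    private var currentPoseIndex = 0

    override init() {
        super.init()
        session.delegate = self
    }

    // Step 1: let the model move freely while the body is tracked.
    func beginTracking() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        phase = .bodyTracking
        session.run(ARBodyTrackingConfiguration())
    }

    // Step 2: once a pose is chosen and held, switch to depth scanning.
    func beginScan(poseIndex: Int) {
        currentPoseIndex = poseIndex
        phase = .depthScanning
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        switch phase {
        case .bodyTracking:
            // Read the skeleton from the frame's ARBodyAnchor to record the pose.
            break
        case .depthScanning:
            // Back-project frame.sceneDepth?.depthMap into 3D points and append
            // them to capturedClouds[currentPoseIndex].
            break
        }
    }

    // Drop everything captured for a pose you disliked.
    func discardPose(_ poseIndex: Int) {
        capturedClouds[poseIndex] = nil
    }
}
```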

    Also, I am not sure whether body tracking combined with the accuracy of the LiDAR will give you a satisfying result when you merge data from different poses. With the scenario I'm describing, you can display the merged PCD and you can always easily drop the data from a specific pose you disliked.
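    If you do attempt merging across poses, one rough idea (my own sketch; it ignores limb articulation and only handles the overall body rotation) is to re-express each pose's points in the body anchor's frame recorded during the tracking step:

```swift
import simd

// Transform world-space points into the body anchor's local frame so that
// clouds captured at different body rotations roughly line up.
// bodyTransform is the ARBodyAnchor.transform recorded for that pose.
func alignToBody(points: [simd_float3], bodyTransform: simd_float4x4) -> [simd_float3] {
    let worldToBody = simd_inverse(bodyTransform)
    return points.map { point in
        let world = simd_make_float4(point, 1)
        let local = worldToBody * world
        return simd_make_float3(local)
    }
}
```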