I have set up an ARKit collaborative session, using a custom network framing protocol, which successfully exchanges collaboration data between users. ARParticipantAnchor works as expected, and I can successfully present a fully tracked Entity on that Anchor. I tried to do the same with an ARFaceAnchor; however, Entities placed on ARFaceAnchors don't seem to be exchanged over the network with the other participants. I am also planning to try the same with an ARBodyAnchor.
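For reference, here is a simplified sketch of the part that already works (sendToAllPeers and receivedData are placeholders for my custom framing layer, not ARKit API):

```swift
import UIKit
import ARKit
import RealityKit

class CollaborationViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(arView)
        arView.session.delegate = self

        // Collaboration is enabled on the world-tracking configuration.
        let config = ARWorldTrackingConfiguration()
        config.isCollaborationEnabled = true
        arView.session.run(config)
    }

    // Outgoing: archive the collaboration data and push it through my framing protocol.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        guard let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                              requiringSecureCoding: true) else { return }
        sendToAllPeers(encoded) // placeholder for my custom networking layer
    }

    // Incoming: feed a peer's collaboration data back into the local session.
    func receivedData(_ data: Data) {
        if let collaboration = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: data) {
            arView.session.update(with: collaboration)
        }
    }

    // When a peer joins, an ARParticipantAnchor arrives and a tracked Entity works fine.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARParticipantAnchor {
            let anchorEntity = AnchorEntity(anchor: anchor)
            anchorEntity.addChild(ModelEntity(mesh: .generateSphere(radius: 0.05)))
            arView.scene.addAnchor(anchorEntity)
        }
    }

    func sendToAllPeers(_ data: Data) { /* my custom network framing protocol */ }
}
```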
Has anybody tried to work with ARFaceAnchors or ARBodyAnchors in a collaborative ARKit/RealityKit session?
Have you found any documentation mentioning potential restrictions on the types of Entities or Anchors that can be exchanged during a collaborative session?
Since all RealityKit configurations are based on ARKit configurations, we'll discuss the latter. To exchange data in a multiuser session, at least two things are needed: an ARWorldMap (containing the environment's sparse point cloud, so the session understands where surrounding real-world objects are) and the ability to collaborate with the other parties.
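Here's a rough sketch of those two prerequisites on ARWorldTrackingConfiguration, the only configuration that exposes both properties (the helper function names are mine, not ARKit's):

```swift
import ARKit

// 1 + 2: relocalize against a shared point cloud and stream collaboration data.
func runCollaborativeSession(on session: ARSession, with sharedMap: ARWorldMap?) {
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = sharedMap        // the environment's sparse point cloud
    config.isCollaborationEnabled = true      // emit ARSession.CollaborationData to peers
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}

// Capturing the current point cloud, e.g. to hand to a late-joining peer.
func captureWorldMap(from session: ARSession, completion: @escaping (Data?) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let worldMap = worldMap else { completion(nil); return }
        completion(try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                     requiringSecureCoding: true))
    }
}
```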
Although ARBodyTrackingConfiguration does have the initialWorldMap instance property (explicitly indicating that it is able to work with an environmental sparse point cloud), this configuration lacks the isCollaborationEnabled flag. ARFaceTrackingConfiguration has neither property.
By default, initialWorldMap = nil and isCollaborationEnabled = false.
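A quick compile-time comparison makes the limitation obvious; only the world-tracking configuration accepts both settings (savedMap stands in for a previously captured map):

```swift
import ARKit

let savedMap: ARWorldMap? = nil // placeholder for a map captured earlier

let worldConfig = ARWorldTrackingConfiguration()
worldConfig.initialWorldMap = savedMap         // OK
worldConfig.isCollaborationEnabled = true      // OK

let bodyConfig = ARBodyTrackingConfiguration()
bodyConfig.initialWorldMap = savedMap          // OK: can relocalize against a point cloud
// bodyConfig.isCollaborationEnabled = true    // error: no such property

let faceConfig = ARFaceTrackingConfiguration()
// faceConfig.initialWorldMap = savedMap       // error: no such property
// faceConfig.isCollaborationEnabled = true    // error: no such property
```

In other words, face tracking can't take part in map sharing or collaboration at all, and body tracking can only consume a world map locally, which is consistent with face- and body-anchored Entities never reaching the other participants.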