swift, augmented-reality, scenekit, arkit, realitykit

Scanning a Real-World Object and generating a 3D Mesh from it


An ARKit app allows us to create an ARReferenceObject and use it to reliably recognize the position and orientation of real-world objects. We can also save the finished .arobject file.


However, ARReferenceObject contains only the spatial feature information needed for ARKit to recognize the real-world object; it is not a displayable 3D reconstruction of that object.

func createReferenceObject(transform: simd_float4x4,
                              center: simd_float3,
                              extent: simd_float3,
                   completionHandler: @escaping (ARReferenceObject?, Error?) -> Void)
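
For context, here is a minimal sketch of how this method can be called on a running ARSession to export the finished .arobject file; the transform, center, extent and destination URL are assumed to come from your own scanning UI:

import ARKit

// Minimal sketch: exporting a scanned ARReferenceObject to an .arobject file.
// The transform / center / extent values describe the scanned bounding volume
// and are assumed to come from your own scanning UI.
func exportReferenceObject(from session: ARSession,
                           transform: simd_float4x4,
                           center: simd_float3,
                           extent: simd_float3,
                           to url: URL) {
    session.createReferenceObject(transform: transform,
                                  center: center,
                                  extent: extent) { referenceObject, error in
        guard let referenceObject = referenceObject else {
            print("Scanning failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            // Persist the finished .arobject file on disk.
            try referenceObject.export(to: url, previewImage: nil)
            print("Saved .arobject to \(url.path)")
        } catch {
            print("Export failed: \(error)")
        }
    }
}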

My question:

Is there a method that allows us to reconstruct digital 3D geometry (low-poly or high-poly) from the .arobject file using Poisson Surface Reconstruction or Photogrammetry?


Solution

  • RealityKit 2.0 | Object Capture API

    Object Capture API, announced at WWDC 2021, provides you with the long-awaited photogrammetry tools. As output, we get a USDZ model with a hi-res texture.

    Read about photogrammetry HERE.
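
    Here is a minimal sketch of the photogrammetry workflow on macOS 12+ using PhotogrammetrySession; the input folder of captured photos, the output URL and the reconstructModel() helper are hypothetical placeholders, not a fixed recipe:

    import Foundation
    import RealityKit

    // Minimal sketch: turning a folder of photos into a USDZ model (macOS 12+).
    // The input and output URLs are hypothetical placeholders.
    @available(macOS 12.0, *)
    func reconstructModel() throws {
        let inputFolder = URL(fileURLWithPath: "/path/to/captured/images", isDirectory: true)
        let outputFile = URL(fileURLWithPath: "/path/to/output/model.usdz")

        var configuration = PhotogrammetrySession.Configuration()
        configuration.featureSensitivity = .normal
        configuration.sampleOrdering = .unordered

        let session = try PhotogrammetrySession(input: inputFolder,
                                                configuration: configuration)

        // Listen for progress and completion messages asynchronously.
        Task {
            for try await output in session.outputs {
                switch output {
                case .requestProgress(_, let fraction):
                    print("Progress: \(fraction)")
                case .requestComplete(_, .modelFile(let url)):
                    print("USDZ written to \(url.path)")
                case .processingComplete:
                    print("All requests finished")
                default:
                    break
                }
            }
        }

        // Request a medium-detail USDZ model with a hi-res texture.
        try session.process(requests: [.modelFile(url: outputFile, detail: .medium)])
    }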

  • ARKit | Mesh Reconstruction

    Using an iOS device with LiDAR and ARKit 3.5/4.0/5.0, you can easily reconstruct a topological map of the surrounding environment. The Scene Reconstruction feature starts working immediately after launching the current ARSession.
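
    As an example, here is a minimal sketch of turning Scene Reconstruction on for a RealityKit ARView; the runSceneReconstruction(in:) helper and the scene-understanding options chosen here are my own assumptions:

    import ARKit
    import RealityKit

    // Minimal sketch: enabling LiDAR Scene Reconstruction for an existing ARView.
    // The `runSceneReconstruction(in:)` helper is illustrative, not an ARKit API.
    func runSceneReconstruction(in arView: ARView) {
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
            print("Scene Reconstruction requires a LiDAR-equipped device.")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .meshWithClassification   // or just .mesh
        config.environmentTexturing = .automatic

        // Occlusion and physics for the reconstructed mesh.
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.environment.sceneUnderstanding.options.insert(.physics)
        // Visualize the wireframe of the reconstructed mesh.
        arView.debugOptions.insert(.showSceneUnderstanding)

        arView.session.run(config)
    }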

    Apple LiDAR works within a 5-meter range. The scanner can help you improve the quality of the ZDepth channel and of such features as People/Real World Objects Occlusion, Motion Tracking, Immediate Physics Contact Body and Raycasting.

    Other awesome peculiarities of the LiDAR scanner are:

    • you can use your device in a poorly lit room
    • you can track pure white walls with no features at all
    • you can detect planes almost instantaneously

    Consider that the quality of an object scanned with LiDAR isn't as good as you might expect: small details are not scanned, because the resolution of the Apple LiDAR isn't high enough.
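
    If you need to access the reconstructed geometry itself, the session delivers it as ARMeshAnchor objects. The MeshInspector class below is a hypothetical helper that simply logs the vertex and face counts of each mesh chunk; assign an instance as the session delegate before running the configuration:

    import ARKit

    // Minimal sketch: inspecting the reconstructed mesh delivered by the session.
    // `MeshInspector` is a hypothetical helper, not part of ARKit.
    final class MeshInspector: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let meshAnchor as ARMeshAnchor in anchors {
                // Each ARMeshAnchor carries one chunk of the reconstructed environment.
                let geometry = meshAnchor.geometry
                print("Vertices: \(geometry.vertices.count), faces: \(geometry.faces.count)")
            }
        }
    }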