swift · augmented-reality · arkit

ARKit – How to detect the colour of specific feature point in sceneView?


I would like to get the colour of a detected real-world object at a specific feature point in the sceneView. For example, I have a feature point detected at (x: 10, y: 10, z: 10).

How do I get the colour of the object/surface at this position?


Solution

  • At the moment it's not possible to get the colour of a real-world object under a feature point using ARKit methods (the way you may have seen in many compositing apps). There is no ARKit API that lets you multiply the alpha of a feature point by the RGB value of the corresponding pixel in the video stream.


    .showFeaturePoints is an ARSCNDebugOptions debug option for an ARSCNView. It simply renders the detected 3D feature points in the world:

    @available(iOS 11.0, *)
    public static let showFeaturePoints: SCNDebugOptions
    

    But I'm sure you can try applying a CIFilter to the ARKit camera feed containing the feature points.

    Feature points in your scene are yellow, so you can use a chroma key effect to extract an alpha channel. Then you need to multiply this alpha by the RGB from the camera, which gives you colour-coded feature points.

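    A rough sketch of that chroma-key step, assuming iOS with Core Image and UIKit available. The 40°–80° hue window for "yellow" is an assumption you would tune for ARKit's actual point colour:

    ```swift
    import CoreImage
    import UIKit

    // Builds a CIColorCube filter that keeps yellow-ish pixels opaque
    // and makes everything else transparent – a simple chroma key.
    // The hue range below is a guess for ARKit's yellow feature points.
    func yellowChromaKey() -> CIFilter? {
        let size = 64
        var cube = [Float]()
        cube.reserveCapacity(size * size * size * 4)
        for b in 0..<size {
            for g in 0..<size {
                for r in 0..<size {
                    let rf = CGFloat(r) / CGFloat(size - 1)
                    let gf = CGFloat(g) / CGFloat(size - 1)
                    let bf = CGFloat(b) / CGFloat(size - 1)
                    var hue: CGFloat = 0, sat: CGFloat = 0
                    UIColor(red: rf, green: gf, blue: bf, alpha: 1)
                        .getHue(&hue, saturation: &sat, brightness: nil, alpha: nil)
                    let isYellow = hue > 40.0/360.0 && hue < 80.0/360.0 && sat > 0.3
                    let a: Float = isYellow ? 1 : 0
                    // Premultiplied RGBA entry for the lookup cube
                    cube += [Float(rf) * a, Float(gf) * a, Float(bf) * a, a]
                }
            }
        }
        let data = cube.withUnsafeBufferPointer { Data(buffer: $0) }
        return CIFilter(name: "CIColorCube",
                        parameters: ["inputCubeDimension": size,
                                     "inputCubeData": data])
    }
    ```

    You would run the rendered view through this filter to get the matte, then composite it with the raw camera image.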

    Alternatively, you can use a CIDifferenceBlendMode operation from Core Image's compositing operations. You need two sources – one with feature points and one without them. Then you have to modify the result of the difference operation and assign it to the alpha channel before multiplication.
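    The difference approach might be sketched like this, assuming sceneView is your ARSCNView with .showFeaturePoints enabled. In practice the snapshot and the captured frame differ in resolution and orientation and would need aligning first:

    ```swift
    import ARKit
    import CoreImage

    // Source A: the rendered view, including the yellow feature points.
    // Source B: the raw camera frame, which has no feature points.
    func colourCodedFeaturePoints(in sceneView: ARSCNView) -> CIImage? {
        guard let frame = sceneView.session.currentFrame,
              let snapshot = CIImage(image: sceneView.snapshot())
        else { return nil }
        let camera = CIImage(cvPixelBuffer: frame.capturedImage)

        // The difference is non-zero only where the points were drawn.
        let diff = snapshot.applyingFilter("CIDifferenceBlendMode",
            parameters: [kCIInputBackgroundImageKey: camera])

        // Collapse the difference to a grayscale matte …
        let matte = diff.applyingFilter("CIMaximumComponent")

        // … and use it as the alpha when keeping the camera RGB.
        return camera.applyingFilter("CIBlendWithMask",
            parameters: [kCIInputBackgroundImageKey: CIImage.empty(),
                         kCIInputMaskImageKey: matte])
    }
    ```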