I am new to developing iOS applications and have just started building some sample ARKit applications from GitHub.
One that I tried is from this link: https://github.com/ttaulli/MeasureAR
This basically measures the distance between two points. When I launched this application on my iPhone, I could see a point cloud (a collection of 3D points around the object I wanted to measure). What I could not understand is: why were the two points selected for measurement not taken from the point cloud?
I would also like to know the following things:
1) What is the exact purpose of the point cloud in ARKit measuring applications?
2) Does the point cloud density depend upon the colour of an object (the applications basically run on a live camera)? Are there any other factors influencing this point cloud?
I would also be glad if someone could provide me with any other sample ARKit measurement applications which ask the user to pick two points from the point cloud.
Thanks in advance!
The pointCloud you are referring to is actually part of the ARSCNDebugOptions, which are:
Options for drawing overlay content to aid debugging of AR tracking in a SceneKit view.
You can set these like so: ARSCNDebugOptions.showFeaturePoints
e.g:
augmentedRealityView.debugOptions = ARSCNDebugOptions.showFeaturePoints
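For context, a minimal setup sketch is shown below; it assumes an ARSCNView outlet named augmentedRealityView (matching the naming used in the rest of this answer), and the world-origin option is just an optional extra for debugging:
import UIKit
import ARKit

class ViewController: UIViewController {

    //1. An ARSCNView Connected In The Storyboard (The Outlet Name Is An Assumption For This Sketch)
    @IBOutlet var augmentedRealityView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()

        //2. Show The Detected featurePoints (And Optionally The World Origin) While Debugging
        augmentedRealityView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                                             ARSCNDebugOptions.showWorldOrigin]
    }
}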
A featurePoint, therefore, is defined by Apple as:
A point automatically identified by ARKit as part of a continuous surface, but without a corresponding anchor.
What this means is that ARKit processes each video frame to extract features of the environment, and as you move around, more features are detected and thus the device can better estimate properties like the orientation and the position of physical objects.
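Note that this processing only happens while a world-tracking session is running; a minimal sketch of starting one (again assuming the augmentedRealityView property from above) would be:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    //1. Create A World Tracking Configuration
    let configuration = ARWorldTrackingConfiguration()

    //2. Run The Session So ARKit Starts Extracting featurePoints From The Camera Feed
    augmentedRealityView.session.run(configuration)
}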
As you will have seen, featurePoints appear as yellow dots in the ARSCNView, and poor feature extraction is usually caused by:
(a) poor lighting,
(b) a lack of texture,
(c) erratic movement of the user's device.
One very important thing to realise is that featurePoints are not consistent: they are rendered from each frame of the session, and as such change frequently with lighting conditions, movement etc.
Basically, featurePoints are there to help you visualise the suitability of the environment for placing objects, e.g. the more featurePoints there are, the more texture or features there are in the environment.
You don't really need to see them per se, as ARKit uses them under the hood for things like performing an ARSCNHitTest.
Building on from this point, therefore, featurePoints can be used in conjunction with an ARSCNHitTest, which:
Searches for real-world objects or AR anchors in the captured camera image corresponding to a point in the SceneKit view.
The results of this hitTest can then allow you to place virtual content, e.g.:
/// Performs An ARHitTest So We Can Place An SCNNode
///
/// - Parameter gesture: UITapGestureRecognizer
@objc func placeVideoNode(_ gesture: UITapGestureRecognizer){

    //1. Get The Current Location Of The Tap
    let currentTouchLocation = gesture.location(in: self.augmentedRealityView)

    //2. Perform An ARHitTest To Search For Any Feature Points
    guard let hitTest = self.augmentedRealityView.hitTest(currentTouchLocation, types: .featurePoint).first else { return }

    //3. If We Have Hit A Feature Point Get Its World Transform
    let hitTestTransform = hitTest.worldTransform.columns.3

    //4. Convert To SCNVector3
    let coordinatesToPlaceModel = SCNVector3(hitTestTransform.x, hitTestTransform.y, hitTestTransform.z)

    //5. Create An SCNNode & Add It At The Position Retrieved
    let sphereNode = SCNNode()
    let sphereGeometry = SCNSphere(radius: 0.1)
    sphereGeometry.firstMaterial?.diffuse.contents = UIColor.cyan
    sphereNode.geometry = sphereGeometry
    sphereNode.position = coordinatesToPlaceModel
    self.augmentedRealityView.scene.rootNode.addChildNode(sphereNode)
}
You could technically measure the distance between featurePoints, but this would really be pointless, as they are constantly changing, and the purpose of a measuring app is to measure the distance between two or more fixed points.
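By way of illustration, a measuring app would typically store the positions of two nodes placed via the hit test above (which stay fixed in world space) and compute the distance between those positions; this is a minimal sketch only, and the function name is purely illustrative:
/// Returns The Distance In Metres Between Two SCNVector3 Positions
///
/// - Parameters:
///   - startPoint: SCNVector3
///   - endPoint: SCNVector3
func distanceBetween(_ startPoint: SCNVector3, and endPoint: SCNVector3) -> Float {

    //1. Calculate The Difference On Each Axis
    let dx = endPoint.x - startPoint.x
    let dy = endPoint.y - startPoint.y
    let dz = endPoint.z - startPoint.z

    //2. Return The Euclidean Distance (ARKit & SceneKit Units Are Metres)
    return sqrtf((dx * dx) + (dy * dy) + (dz * dz))
}
For example, if you placed two sphereNodes as above, you could call distanceBetween(firstSphereNode.position, and: secondSphereNode.position) once both have been positioned.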
There are countless tutorials online about both ARKit, and how to make a measuring app.
I hope this helps your understanding a little.
Update:
There is an ARPointCloud, which is:
A collection of points in the world coordinate space of the AR session.
And which can be accessed by using the ARFrame rawFeaturePoints property to obtain a point cloud representing intermediate results of the scene analysis ARKit uses to perform world tracking.
However, to answer your question, this is basically what you are seeing when you set the debugOptions.
If you are curious and want to get the featurePoints yourself, you can do so by using the currentFrame of the ARSession, e.g.:
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {

    //1. Check Our Frame Is Valid & That We Have Received Our Raw Feature Points
    guard let currentFrame = self.augmentedRealitySession.currentFrame,
        let featurePointsArray = currentFrame.rawFeaturePoints?.points else { return }
}
An example of this can be seen here: Visualizing Raw Feature Points
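If you simply want to inspect what rawFeaturePoints contains, you could extend the delegate callback above along these lines (a sketch only; the augmentedRealitySession property follows the naming already used in this answer, and the logging is purely illustrative):
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {

    //1. Check Our Frame Is Valid & That We Have Received Our Raw Feature Points
    guard let currentFrame = self.augmentedRealitySession.currentFrame,
        let featurePointsArray = currentFrame.rawFeaturePoints?.points else { return }

    //2. Each Element Is A simd_float3 In World Space; Here We Simply Log How Many ARKit Is Tracking
    print("ARKit Is Currently Tracking \(featurePointsArray.count) Feature Points")

    //3. Individual Points Can Be Converted To SCNVector3 If You Want To Render Your Own Markers
    if let firstPoint = featurePointsArray.first {
        print("The First Feature Point Is At \(SCNVector3(firstPoint.x, firstPoint.y, firstPoint.z))")
    }
}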