I'm trying to visualize depth data captured by ARSession and ARCamera. I can see that the depth is mostly good, except at the edges of flat objects (like a monitor), where I would like a sharper transition between the rendered plane and the background.
What I'm observing is a "fraying" of the very edge of objects, where the depth value changes significantly, which creates the free-floating lines shown in the image below. I would like to remove those free-floating lines.
This makes me ask:
What is the "real" resolution of ARDepthData (i.e. what fraction of each value can safely be discarded as noise)?
How can I avoid rounding errors when working with ARDepthData?
open class ARDepthData : NSObject {
/* A pixel buffer that contains per-pixel depth data (in meters). */
unowned(unsafe) open var depthMap: CVPixelBuffer { get }
}
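For reference, the depthMap can be wrapped in a Metal texture for sampling roughly as in the sketch below (this assumes a CVMetalTextureCache created up front; the function and parameter names are just illustrative):

import ARKit
import Metal
import CoreVideo

// Sketch: wrap ARDepthData.depthMap in a Metal texture via a CVMetalTextureCache.
// `textureCache` is assumed to have been created once with CVMetalTextureCacheCreate.
func makeDepthTexture(from frame: ARFrame,
                      textureCache: CVMetalTextureCache) -> MTLTexture? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)

    // sceneDepth delivers kCVPixelFormatType_DepthFloat32, i.e. one 32-bit float (meters) per pixel.
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           textureCache,
                                                           depthMap,
                                                           nil,
                                                           .r32Float,
                                                           width,
                                                           height,
                                                           0,
                                                           &cvTexture)
    guard status == kCVReturnSuccess, let texture = cvTexture else { return nil }
    return CVMetalTextureGetTexture(texture)
}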
Here's how I'm trying to clip the values, but this also degrades the resolution of the rest of the image:
const float2 samplePoint = float2(x, y); // normalized texture coordinate, 0.0 - 1.0 range
const float2 clippingMultiplier = float2(64, 64);
// Attempt to clip the fractional components by snapping to a 64x64 grid
const float2 clippedTextureCoordinate = floor(samplePoint * clippingMultiplier) / clippingMultiplier;
// What are the appropriate sampling filter parameters to reduce the error like in the screenshot?
constexpr sampler depthSampler(mip_filter::nearest, mag_filter::nearest, min_filter::nearest);
const float depth = depthMap.sample(depthSampler, clippedTextureCoordinate).r;
// depth is used to calculate the world position of this pixel
The code above is my attempt to remove the fractional components from the depth values.
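For comparison, a variant that snaps to the centers of the depth map's native texels (rather than an arbitrary 64×64 grid) might look like the sketch below; the 256×192 size is assumed from the buffer dimensions noted further down:

// Sketch: snap to the center of the nearest texel of the 256x192 depth map,
// so nearest-neighbor sampling never lands on a texel border.
const float2 depthMapSize = float2(256.0, 192.0);   // assumed native resolution of sceneDepth
const float2 texelCenter = (floor(samplePoint * depthMapSize) + 0.5) / depthMapSize;
constexpr sampler depthSampler(coord::normalized,
                               filter::nearest,
                               address::clamp_to_edge);
const float depth = depthMap.sample(depthSampler, texelCenter).r;

Adding 0.5 before dividing moves each sample to the middle of a depth texel, so the nearest filter never straddles two texels; it does not, however, change the underlying 256×192 resolution.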
For the front-facing TrueDepth camera, AVDepthData has a maximum depth resolution of 640×480.
ARDepthData appears to have a similarly low resolution:
depth: w256 h192
confidence: w256 h192
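These numbers can be checked directly from the pixel buffers (a quick sketch; `frame` is assumed to be an ARFrame delivered while the .sceneDepth frame semantic is enabled):

import ARKit

// Print the native resolution of the depth and confidence buffers.
func logDepthResolution(for frame: ARFrame) {
    guard let sceneDepth = frame.sceneDepth else { return }

    let depthMap = sceneDepth.depthMap
    print("depth: w\(CVPixelBufferGetWidth(depthMap)) h\(CVPixelBufferGetHeight(depthMap))")

    if let confidenceMap = sceneDepth.confidenceMap {
        print("confidence: w\(CVPixelBufferGetWidth(confidenceMap)) h\(CVPixelBufferGetHeight(confidenceMap))")
    }
}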