How can I generate a point cloud from a depth or RGB image?
Example:
I need to generate a point cloud from that depth/RGB image pair, but I couldn't find code for that.
I managed to find an example in three.js, but I couldn't export the result. Can someone help me generate a point cloud from a depth and RGB image?
You could take a look at how the PCL library does this in its OpenNI 2 grabber module: this module is responsible for processing the RGB/depth images coming from OpenNI-compatible devices (e.g. the Kinect). Another example is depth2cloud from ROS. Let's concentrate on the former.
First, it gets the intrinsic camera parameters. These are usually obtained offline with a calibration procedure, or you might already know them for your camera.
float constant_x = 1.0f / device_->getDepthFocalLength ();  // 1/fx
float constant_y = 1.0f / device_->getDepthFocalLength ();  // 1/fy (same focal length for both axes)
float centerX = ((float)cloud->width - 1.f) / 2.f;          // principal point assumed at image center
float centerY = ((float)cloud->height - 1.f) / 2.f;
Then, after a number of validity checks and after creating appropriately sized buffers, it sets the 3D coordinates of each point:
pt.z = depth_map[depth_idx] * 0.001f;                            // depth in mm -> meters
pt.x = (static_cast<float> (u) - centerX) * pt.z * constant_x;   // pinhole back-projection
pt.y = (static_cast<float> (v) - centerY) * pt.z * constant_y;
As you can see from the original code, it iterates over all the pixels of the depth image and uses the camera parameters to back-project each of them into 3D space.
A similar reasoning applies to the convertToXYZRGBPointCloud function, which additionally takes care of setting the proper RGB color for each point.