I'm working on a classification problem (object classification for autonomous vehicles). I'm using the KITTI dataset, which provides LiDAR and camera data, and I want to use both modalities to perform the task.
The 3D LiDAR points are projected onto the coordinate system of the RGB image, resulting in a sparse LiDAR image:
Each pixel encodes depth (distance to the point: sqrt(X² + Y²)), scaled between 0 and 255.
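That encoding step can be sketched in NumPy as follows (a minimal sketch: the pixel coordinates `u`, `v` are assumed to have already been obtained by projecting the points through the KITTI calibration matrices, and the min/max scaling is one possible choice):

```python
import numpy as np

def encode_depth_image(u, v, X, Y, img_h, img_w):
    """Build a sparse depth image: each hit pixel stores the planar
    distance sqrt(X^2 + Y^2), min/max-scaled to the range 0-255.
    u, v: integer pixel coordinates of the projected points.
    X, Y: the corresponding coordinates in the LiDAR frame."""
    depth = np.sqrt(X**2 + Y**2)
    # Scale to [0, 255]; the epsilon guards against a constant-depth cloud
    scaled = 255.0 * (depth - depth.min()) / (depth.max() - depth.min() + 1e-9)
    img = np.zeros((img_h, img_w), dtype=np.uint8)
    img[v, u] = scaled.astype(np.uint8)  # pixels with no return stay 0
    return img
```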
In order to obtain better results from my CNN, I need a dense LiDAR image. Does anyone know how to do this in Python?
I would like to obtain something like this
I've never worked with point-cloud/LiDAR data before, but since nobody has answered yet, I'll give it my best shot. I'm not sure about inpainting approaches per se, though I imagine they might not work very well (except perhaps a variational method, which I presume would be quite slow). But if your goal is to project the 3D LiDAR readings (accompanied by ring IDs and laser intensity readings) into a dense 2D matrix for use in a CNN, the following reference might prove useful. Additionally, that paper cites an earlier work (Collar Line Segments for Fast Odometry Estimation from Velodyne Point Clouds) which covers the polar binning technique in more detail and has C++ code available. Check out the papers, but I'll try to summarize the technique here:
CNN for Very Fast Ground Segmentation in Velodyne LiDAR Data - Describes its preprocessing technique in section III.A (Encoding Sparse 3D Data Into a Dense 2D Matrix).
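Here's a minimal NumPy sketch of that polar binning idea, assuming each point comes with its laser ring ID (rows = rings, columns = horizontal-angle bins; the bin counts and the closest-return tie-breaking rule are my assumptions, not necessarily the paper's exact choices):

```python
import numpy as np

def polar_bin(points, ring_ids, n_rings=64, n_bins=360):
    """points: (N, 3) array of X, Y, Z; ring_ids: (N,) laser ring index.
    Returns an (n_rings, n_bins) dense matrix of ranges (0 where empty)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rng = np.sqrt(x**2 + y**2 + z**2)
    # Horizontal angle mapped to [0, 2*pi), then discretized into columns
    phi = np.arctan2(y, x) % (2 * np.pi)
    cols = np.minimum((phi / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
    mat = np.zeros((n_rings, n_bins), dtype=np.float32)
    # If two points land in the same cell, keep the closest return:
    # write far points first so near points overwrite them
    order = np.argsort(-rng)
    mat[ring_ids[order], cols[order]] = rng[order]
    return mat
```

Each Velodyne laser sweeps one row, so this matrix is dense along the ring axis by construction; other channels (intensity, height) can be binned the same way.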
Finally, the following paper introduces some techniques for feeding the sparse Velodyne readings into a CNN. Maybe see if any of these improve your performance?
Vehicle Detection from 3D Lidar Using Fully Convolutional Network - Describes its preprocessing technique in section III.A (Data Preparation).
- Encoding the range data as a 2-channel image
- Unequal (up/down)sampling
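As a rough illustration of the 2-channel idea, here is a sketch of a front-view projection where one channel holds range and the other holds height. This is my own approximation of the approach, not the paper's exact parameters: the angular resolutions and image size below are assumptions.

```python
import numpy as np

def two_channel_image(points, d_phi=np.radians(0.5), d_theta=np.radians(0.4),
                      h=64, w=512):
    """points: (N, 3) array of X, Y, Z in the sensor frame.
    Returns an (h, w, 2) image: channel 0 = range, channel 1 = height."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rng = np.sqrt(x**2 + y**2 + z**2)
    phi = np.arctan2(y, x)                # azimuth angle
    theta = np.arcsin(z / (rng + 1e-9))   # elevation angle
    # Discretize angles into pixel coordinates, centered in the image
    c = (w // 2 - phi / d_phi).astype(int)
    r = (h // 2 - theta / d_theta).astype(int)
    img = np.zeros((h, w, 2), dtype=np.float32)
    valid = (r >= 0) & (r < h) & (c >= 0) & (c < w)
    img[r[valid], c[valid], 0] = rng[valid]  # range channel
    img[r[valid], c[valid], 1] = z[valid]    # height channel
    return img
```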
All these techniques are implemented with respect to the KITTI dataset/Velodyne LiDAR, so I imagine they would work (perhaps with some modification) for your particular use case.