Tags: matlab, 3d, camera, matlab-cvst, point-clouds

warped/curved point clouds


I am working on sparse reconstruction using a calibrated stereo pair. Here is the approach I have taken, step by step:

1- I calibrated my stereo cameras using the Stereo Camera Calibrator app in MATLAB.

2- I took a pair of stereo images and undistorted each image.

3- I detected, extracted, and matched point features.

4- I used the triangulate function in MATLAB to get the 3D coordinates of the matched points by passing the stereoParameters object into triangulate. The resulting 3D coordinates are with respect to the optical center of camera 1 (the right camera) and are in millimeters.

The problem is that the point cloud appears warped and curved towards the edges of the image. At first this looked to me like barrel distortion of the lenses, so I recalibrated the Bumblebee XB3 cameras with the MATLAB Camera Calibrator app, this time using 3 radial distortion coefficients and also including the tangential and skew parameters, but the results are the same. I also tried Caltech's Camera Calibration Toolbox, and it gave the same results as MATLAB; the radial distortion coefficients are similar in both toolboxes. Another problem is that the Z values in the point cloud are all negative, but I am thinking that might come from the fact that I am using the right camera as camera 1 and the left camera as camera 2, as opposed to what MATLAB's coordinate system is in the link attached; a quick way to check the camera ordering from the calibration itself is sketched below.
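One quick sanity check, sketched here assuming the saved stereoParameters object from the code below: the calibrated translation between the cameras encodes which physical camera the calibrator treated as camera 1, and its magnitude should match the XB3's physical baseline.

% Inspect the calibrated extrinsics; the translation is in world units (mm),
% its magnitude should match the stereo baseline, and its sign reflects
% the camera ordering used during calibration
load('mystereoparams.mat');
disp(stereoParams.TranslationOfCamera2);
disp(norm(stereoParams.TranslationOfCamera2));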

I have attached a couple of pictures of the 3D point cloud from both the sparse and the dense 3D reconstruction. I am not interested in dense 3D, but I ran it to see whether the problem still exists, which it does. I believe that means the main problem lies with the images and the camera calibration rather than with the algorithms.

Now my questions are:

1- What is the main reason (or reasons) for getting warped/curved 3D point clouds? Is it camera calibration only, or can other steps introduce error as well? How can I check on that?

2- Can you suggest another camera calibration toolbox besides MATLAB's and Caltech's? Maybe one that is more suitable for radial distortion?

Thanks

Images:

[Image: point cloud from the sparse 3D reconstruction]

[Image: point cloud from the dense 3D reconstruction]

Links:

coordinate system

Code:

clear
close all
clc

load('mystereoparams.mat');
I11 = imread('Right.tif');
I22 = imread('Left.tif');
figure, imshowpair(I11, I22, 'montage');
title('Pair of Original Images');

[I1, newOrigin1] = undistortImage(I11,stereoParams.CameraParameters1);
[I2, newOrigin2] = undistortImage(I22,stereoParams.CameraParameters2);
figure, imshowpair(I1, I2, 'montage');
title('Undistorted Images');

% Detect feature points
imagePoints1 = detectSURFFeatures(rgb2gray(I1), 'MetricThreshold', 600);
imagePoints2 = detectSURFFeatures(rgb2gray(I2), 'MetricThreshold', 600);

% Extract feature descriptors
features1 = extractFeatures(rgb2gray(I1), imagePoints1);
features2 = extractFeatures(rgb2gray(I2), imagePoints2);

% Visualize several extracted SURF features
figure;
imshow(I1);
title('1500 Strongest Feature Points from Image1');
hold on;
plot(selectStrongest(imagePoints1, 1500));

indexPairs = matchFeatures(features1, features2, 'MaxRatio', 0.4);
matchedPoints1 = imagePoints1(indexPairs(:, 1));
matchedPoints2 = imagePoints2(indexPairs(:, 2));

% Visualize correspondences
figure;
showMatchedFeatures(I1, I2, matchedPoints1, matchedPoints2,'montage');
title('Original Matched Features from the Stereo Pair');

% Transform matched points to the original image's coordinates
matchedPoints1.Location = bsxfun(@plus, matchedPoints1.Location, newOrigin1);
matchedPoints2.Location = bsxfun(@plus, matchedPoints2.Location, newOrigin2);

[Cloud, reprojErrors] = triangulate(matchedPoints1, matchedPoints2, stereoParams);
figure;plot3(Cloud(:,1),Cloud(:,2),Cloud(:,3),'b.');title('Point Cloud before noisy match removal');
xlabel('X'), ylabel('Y'), zlabel('Depth (Z) in mm')

% Eliminate noisy points: keep matches whose reprojection error is within
% one standard deviation of the mean, and whose depth lies in the
% expected working range
errorDists = sqrt(sum(reprojErrors .^ 2, 2));  % reprojection error per point
meanError = mean(errorDists)
stdError = std(errorDists)

validIdx = errorDists < meanError + stdError;
validIdx(Cloud(:,3) > 0) = false;          % keep only negative-Z points
validIdx(abs(Cloud(:,3)) > 1800) = false;  % discard points farther than 1.8 m
validIdx(abs(Cloud(:,3)) < 1000) = false;  % discard points closer than 1 m

points3D = Cloud(validIdx, :);

figure;plot3(points3D(:,1),points3D(:,2),points3D(:,3),'b.');title('Point Cloud after noisy match removal');
xlabel('X'), ylabel('Y'), zlabel('Depth (Z) in mm')

validPoints1 = matchedPoints1(validIdx, :);
validPoints2 = matchedPoints2(validIdx, :);

figure;
showMatchedFeatures(I1, I2, validPoints1,validPoints2,'montage');
title('Matched Features After Removing Noisy Matches');

% get the color of each reconstructed point
validPoints1 = round(validPoints1.Location);
numPixels = size(I1, 1) * size(I1, 2);
allColors = reshape(im2double(I1), [numPixels, 3]);
colorIdx = sub2ind([size(I1, 1), size(I1, 2)], validPoints1(:,2), ...
    validPoints1(:, 1));
color = allColors(colorIdx, :);

% add green point representing the origin
points3D(end+1,:) = [0,0,0];
color(end+1,:) = [0,1,0];

% show images
figure('units','normalized','outerposition',[0 0 .5 .5])
subplot(1,2,1);
imshowpair(I1, I2, 'montage');
title('Original Images')

% plot point cloud
hAxes = subplot(1,2,2);
showPointCloud(points3D, color, 'Parent', hAxes, ...
    'VerticalAxisDir', 'down', 'MarkerSize', 40);
xlabel('x-axis (mm)');
ylabel('y-axis (mm)');
zlabel('z-axis (mm)')
title('Reconstructed Point Cloud');

figure, scatter3(points3D(:,1),points3D(:,2),points3D(:,3),50,color,'filled')
xlabel('x-axis (mm)');ylabel('y-axis (mm)');zlabel('z-axis (mm)')
title('Final colored Reconstructed Point Cloud');

Solution

  • Your code looks right. The problem seems to be in the calibration. The fact that you still get a warped point cloud with 3 coefficients tells me that you may not have enough data points close to the edges of the image to estimate the distortion accurately. It is hard to tell from your images, though. If you take a picture of a scene with many straight edges and undistort it, you will get a better idea; a minimal sketch of that check follows this answer.

    So I would recommend taking more images with the checkerboard as close to the edges of the image as you can get, and seeing if that helps. A sketch for visualizing corner coverage is also included below.

    Another thing to look at would be the estimation errors. As of R2014b, the Stereo Camera Calibrator app can optionally return the standard error of each estimated parameter. Those give you confidence intervals and tell you whether you may need more data points. See this example. A programmatic sketch is also included below.

    Oh, and also make sure that your calibration images are not saved as JPEG. Please use a lossless format such as TIFF or PNG.
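A minimal sketch of the straight-edge check suggested above, assuming a hypothetical test image straight_edges.tif of a scene with long straight lines:

% Undistort a test shot of a scene with long straight edges; if the
% distortion model is accurate, straight lines should come out straight
I = imread('straight_edges.tif');                       % hypothetical test image
J = undistortImage(I, stereoParams.CameraParameters1);  % camera 1 intrinsics
figure, imshowpair(I, J, 'montage');
title('Original vs Undistorted: straight edges should stay straight');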
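To see whether the checkerboard corners actually reach the image borders, a coverage plot helps. This sketch assumes imageFileNames is a cell array of calibration image paths for one camera:

% Overlay the detected checkerboard corners from all calibration images;
% sparse coverage near the borders leaves the radial distortion
% coefficients poorly constrained there
imagePoints = detectCheckerboardPoints(imageFileNames);
figure, hold on
for k = 1:size(imagePoints, 3)
    plot(imagePoints(:,1,k), imagePoints(:,2,k), '.');
end
axis ij
axis equal
title('Checkerboard corner coverage across all calibration images');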
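Finally, a sketch of obtaining the standard errors programmatically rather than through the app (R2014b or later). It assumes imagePoints and worldPoints come from detectCheckerboardPoints and generateCheckerboardPoints on the calibration image pairs:

% Calibrate and request the estimation errors; displayErrors prints each
% parameter together with its standard error, which gives a rough
% confidence interval and hints at whether more calibration images are needed
[stereoParams, pairsUsed, estimationErrors] = estimateCameraParameters( ...
    imagePoints, worldPoints, ...
    'NumRadialDistortionCoefficients', 3, ...
    'EstimateTangentialDistortion', true);
displayErrors(estimationErrors, stereoParams);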