matlab, opencv, computer-vision, stereo-3d, multiple-views

How to convert the camera pose (translation vector) obtained from the essential matrix to the world coordinate system


I have extracted the rotation matrix and translation vector from the essential matrix. The translation vector has a scale ambiguity, so I couldn't determine its "true" value.

My steps were as follows:

F = estimateF(matches1, matches2, 'RANSAC');   % e.g. estimateFundamentalMatrix(matches1, matches2, 'Method', 'RANSAC')
E = K2' * F * K1;                              % essential matrix from F and the camera intrinsics

% Enforce the constraint that E has two equal singular values and one zero
[U, S, V] = svd(E);
s = (S(1,1) + S(2,2)) / 2;
S = diag([s s 0]);
E_new = U * S * V';

% Decompose the corrected essential matrix into rotation and translation
[U, S, V] = svd(E_new);
W = [0 -1 0; 1 0 0; 0 0 1];                    % helper matrix used in the decomposition
R1 = U * W * V';
R2 = U * W' * V';
t1 = U(:,3);                                   % unit-length translation direction
t2 = -t1;
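The steps above only fix the direction of the translation, which is where the scale ambiguity comes from. A quick check (using the variables defined in the steps above, plus an illustrative cross-product matrix) makes this explicit:

norm(t1)                        % always 1, since U is orthonormal and t1 = U(:,3)

% E itself is only defined up to scale, so any positive rescaling of t1 is
% equally consistent with it: [t1]_x * R1 is proportional to E_new.
tx = [  0      -t1(3)   t1(2);
        t1(3)   0      -t1(1);
       -t1(2)   t1(1)   0    ];
E_check = tx * R1;              % equals E_new up to sign and overall scale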

My problem is how to determine the translation of the second camera relative to the first one in mm.


Solution

  • Unless you have some additional information that ties your points to the real world, it is not possible to recover the absolute scale.

    For example, if the matches were corners of squares on a calibration chessboard whose square size in mm you know, then you would be able to tell how far apart the cameras are in mm (see the sketch below).
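For illustration, here is a minimal MATLAB sketch of that idea. It assumes the unit-scale pose (R1, t1) from the question is the physically valid one, and that p1a/p2a and p1b/p2b are the matched pixel coordinates ([x; y]) of two chessboard corners a known distance squareSizeMM apart; the helper lintriangulate and these variable names are made up for the example, not part of the original post.

% Projection matrices for the unit-scale relative pose (world frame = camera 1)
P1 = K1 * [eye(3) zeros(3,1)];
P2 = K2 * [R1 t1];

% Triangulate two chessboard corners whose true separation squareSizeMM is known
Xa = lintriangulate(P1, P2, p1a, p2a);
Xb = lintriangulate(P1, P2, p1b, p2b);

% The whole reconstruction shares a single global scale factor, so the ratio
% of the true distance to the reconstructed one recovers it.
scale = squareSizeMM / norm(Xa - Xb);
t_mm  = scale * t1;             % camera-2 position relative to camera 1, now in mm
fprintf('Baseline: %.1f mm\n', norm(t_mm));

function X = lintriangulate(P1, P2, p1, p2)
% Linear (DLT) triangulation: solve the homogeneous system A*[X; 1] = 0
A = [p1(1)*P1(3,:) - P1(1,:);
     p1(2)*P1(3,:) - P1(2,:);
     p2(1)*P2(3,:) - P2(1,:);
     p2(2)*P2(3,:) - P2(2,:)];
[~, ~, V] = svd(A);
X = V(1:3, end) / V(4, end);
end

The key point is that one known real-world distance between any two reliably matched points is enough to fix the scale of the whole reconstruction, including the baseline between the cameras.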