Tags: iphone, ios, opencv, augmented-reality

Not able to calibrate camera view to 3D Model


I am developing an app that uses Lucas-Kanade (LK) tracking and POSIT for pose estimation. I can obtain the rotation matrix and the projection matrix, and tracking works well, but I am not able to translate the 3D object properly: it does not end up in the place where it should fit.

Can someone help me with this?


Solution

  • Check these links; they may give you some ideas.

    http://computer-vision-talks.com/2011/11/pose-estimation-problem/

    http://www.morethantechnical.com/2010/11/10/20-lines-ar-in-opencv-wcode/

    Now you must also check whether the intrinsic camera parameters are correct. Even a small error in estimating the field of view can cause trouble when reconstructing 3D space, and from your description it sounds like the problem is a bad field-of-view (FOV) angle.

    You can try to measure the FOV, or feed half or double the value to your algorithm and see whether the fit improves.

    There are two conventions for the FOV: the half-angle (measured from the image center to the top or to the left edge) and the full angle (measured from bottom to top, or from left to right). Maybe you mixed them up, using the full angle where the half-angle is expected, or vice versa; the sketch below shows how much that changes the focal length.
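    As a rough illustration, here is a minimal sketch in plain C++ (the image width and the FOV value are made-up numbers for this example, not taken from your setup) of how the focal length in pixels follows from the FOV and the image width, and how far off it ends up if the two conventions are swapped:

        #include <cmath>
        #include <cstdio>

        const double kPi = 3.14159265358979323846;

        // Focal length in pixels, given the FULL horizontal field-of-view angle.
        double focalFromFullFov(double fullFovDeg, double imageWidth)
        {
            double halfRad = (fullFovDeg / 2.0) * kPi / 180.0;
            return (imageWidth / 2.0) / std::tan(halfRad);
        }

        // Focal length in pixels, given the HALF angle (image center to the
        // left or right edge).
        double focalFromHalfFov(double halfFovDeg, double imageWidth)
        {
            double halfRad = halfFovDeg * kPi / 180.0;
            return (imageWidth / 2.0) / std::tan(halfRad);
        }

        int main()
        {
            const double imageWidth = 640.0;  // example image width in pixels
            const double fov = 60.0;          // example FOV value in degrees

            // If 60 degrees is really the full angle but is treated as the half
            // angle (or vice versa), the focal length is off by roughly a factor
            // of three here, and POSIT will place the 3D object at the wrong
            // depth and position.
            std::printf("focal if 60 deg is the full angle: %.1f px\n",
                        focalFromFullFov(fov, imageWidth));   // about 554 px
            std::printf("focal if 60 deg is the half angle: %.1f px\n",
                        focalFromHalfFov(fov, imageWidth));   // about 185 px
            return 0;
        }

    Whichever convention your FOV value uses, the focal length derived from it must match the one you hand to POSIT; otherwise the projected object will appear scaled and shifted relative to the tracked features.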