I am working with the Google Project Tango tablet, streaming data from the tablet into Epic's Unreal Engine 4 to visualize it. However, the orientation of the point cloud data consistently fails to line up with the orientation of the Tango's pose data.
Currently I am taking the pose from the same frame as the xyzIj data, but the resulting point clouds never line up quite right. I tried manually rotating and correcting the cloud data so that it matched the pose data, but the rotation that corrects one set of xyzIj data does not correct the remaining sets.
I am currently working under the assumption that the poses I am using to match these sets are in error, but I am unsure how to find the correct ones. Using getPoseAtTime tends to produce similarly flawed results, though that may have something to do with the fact that I don't quite understand its usage.
Is there a solution to this issue other than hand-correcting each data set, or does everyone deal with similar issues?
Let's start with a few basics:
1) Have you created, and subsequently loaded, an ADF before capturing data?
2) Are you capturing poses relative to the ADF or to the session start? The former is more accurate.
3) There shouldn't be too much drift in rotation; those sensors are reasonably accurate. What looks like rotational drift is usually either positional error or use of the wrong pose.
4) Make sure you have an operating mode that lets you focus on stable data, i.e. capturing point sets while the camera is not in motion. That's a good way to work out whether you're getting bad data from Tango or have issues in your matrix chain.