I have a 3D scene with a movable camera. I have the 3D coordinates of that camera (x, y, z, with y being the height) and its X and Y rotations (up/down and left/right).
I want to get the coordinates (x1, z1) of the point on the floor I'm looking at.
Basically, if the camera is at (0, 4096, 0) (4096 being the height), my xRotation is 45° and my yRotation is 0, I will be looking at the point (4096, 0, 0) on the floor.
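In other words, assuming xRotation is measured down from the horizontal, the looked-at point should be at a horizontal distance of Yposition * tan(90° - xRotation) from the camera: 4096 * tan(45°) = 4096 here.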
I was trying to program it, but I got stuck on the trigonometry. Help me with it.
The following code is what I have right now, and it's not fully working:
float x1, z1, anguloX, anguloY;
anguloX = (90 - Xrotation) / 180 * Pi;
anguloY = (90 - Yrotation) / 180 * Pi;
x1 = Yposition * tan(anguloX) * cos(anguloY);
z1 = Yposition * tan(anguloY) * cos(anguloY);
x1 += Xposition;
z1 += Zposition;
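For reference, a minimal corrected sketch of the trigonometry above, assuming the conventions implied by the example (pitch measured from the horizontal, yaw of 0 looking along +X; both are assumptions):

anguloX = (90 - Xrotation) / 180 * Pi; /* angle from the vertical, in radians */
anguloY = Yrotation / 180 * Pi;        /* yaw in radians; 0 = looking along +X */
x1 = Xposition + Yposition * tan(anguloX) * cos(anguloY);
z1 = Zposition + Yposition * tan(anguloX) * sin(anguloY);

With the example values this gives x1 = 4096 and z1 = 0, as expected.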
Don't bother yourself with this kind of trigonometry; it is often more practical and more understandable to use matrices and transformations for this kind of thing. I suggest trying the following approach instead, which is called ray casting:

1. Compute the camera's sight direction as a vector by applying the camera's rotations to its initial direction.
2. Intersect a ray starting at the camera's position and pointing along that direction with the floor plane.

About the first step, here is the pseudocode:
XRotMat = CreateRotationMatrixAroundXAxis(verticalCameraAngle);
YRotMat = CreateRotationMatrixAroundYAxis(horizontalCameraAngle);
CameraSightDir = YRotMat * XRotMat * initialCameraDir;
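A minimal concrete sketch of this step in C, assuming column vectors, a right-handed coordinate system, and an initial camera direction of (0, 0, -1) (all of these depend on your engine's conventions):

#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* Rotate v around the X axis by angle a (radians). */
Vec3 rotateX(Vec3 v, float a) {
    Vec3 r = { v.x,
               v.y * cosf(a) - v.z * sinf(a),
               v.y * sinf(a) + v.z * cosf(a) };
    return r;
}

/* Rotate v around the Y axis by angle a (radians). */
Vec3 rotateY(Vec3 v, float a) {
    Vec3 r = {  v.x * cosf(a) + v.z * sinf(a),
                v.y,
               -v.x * sinf(a) + v.z * cosf(a) };
    return r;
}

/* CameraSightDir = YRotMat * XRotMat * initialCameraDir */
Vec3 cameraSightDir(float verticalCameraAngle, float horizontalCameraAngle) {
    Vec3 initial = { 0.0f, 0.0f, -1.0f }; /* assumed initial direction */
    return rotateY(rotateX(initial, verticalCameraAngle), horizontalCameraAngle);
}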
And for the second step:
SightRay.Source = Camera.Position;
SightRay.Direction = CameraSightDir;
Intersection = IntersectRayWithPlane(SightRay , FloorPlane);
IntersectRayWithPlane is quite a simple procedure; you can read about it here.
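For the special case of a horizontal floor plane (y = 0), a minimal sketch, reusing the Vec3 type from the snippet above:

/* Returns 1 and fills *hit if the ray reaches y = 0 in front of the camera. */
int intersectRayWithFloor(Vec3 source, Vec3 dir, Vec3 *hit) {
    if (fabsf(dir.y) < 1e-6f) return 0;  /* ray is parallel to the floor */
    float t = -source.y / dir.y;         /* solve source.y + t * dir.y = 0 */
    if (t < 0.0f) return 0;              /* floor is behind the camera */
    hit->x = source.x + t * dir.x;
    hit->y = 0.0f;
    hit->z = source.z + t * dir.z;
    return 1;
}

The hit point's x and z components are the (x1, z1) coordinates asked for in the question.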