I have a minimap on the screen and I'm having trouble determining the angle to a target relative to the camera.
Here is a drawing of what I mean, with some examples of camera position and direction:
Here's my code so far:
// Anchored position around the minimap circle
void CalculateScreenPos()
{
    Vector3 dir = transform.position - cam.position; // direction from camera to target
    dir.y = 0;                                       // project onto the XZ plane
    angle = Angle360(cam.forward.normalized, dir.normalized, cam.right);

    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos(angle * Mathf.PI / 180f) * dir.z,
        eY *  Mathf.Sin(angle * Mathf.PI / 180f) * dir.x
    );

    minimapBlip.anchoredPosition = desiredPosition;
}
public static float Angle360(Vector3 from, Vector3 to, Vector3 right)
{
    float angle = Vector3.Angle(from, to);
    return (Vector3.Angle(right, to) > 90f) ? 360f - angle : angle;
}
But the angle doesn't seem to work properly: I found out that it ranges from
0° + cam.eulerAngles.x to 360° - cam.eulerAngles.x
so it only works as long as the camera never looks toward the ground or the sky.
How do I get rid of the unwanted x euler angle without subtracting/adding it back onto the angle afterwards?
angle -= cam.transform.eulerAngles.x
is a bad choice, because when the result of Angle360 is 360° it gets subtracted again, immediately leading to a wrong angle.
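To make the diagnosis concrete, here is a minimal sketch of how the pitch leaks in (the numbers are made up for illustration): as soon as cam.forward gains a y component, Vector3.Angle can no longer reach 0 against the flattened dir.

// Illustration only (assumed values): a camera pitched 30° down,
// looking at a target that lies straight ahead on the XZ plane.
Vector3 camForward = Quaternion.Euler(30f, 0f, 0f) * Vector3.forward; // y component ≈ -0.5
Vector3 dirToTarget = Vector3.forward;                                // y already zeroed
float a = Vector3.Angle(camForward, dirToTarget);                     // returns 30, never 0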
Also, the minimap circle can actually be an ellipse; that's why I put eX and eY into the desired position, they determine the extents of the ellipse.
I solved it:
angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg - Camera.main.transform.eulerAngles.y;
Vector2 desiredPosition = new Vector2(
    eX * -Mathf.Cos((angle + 90f) * Mathf.Deg2Rad),
    eY *  Mathf.Sin((angle + 90f) * Mathf.Deg2Rad)
);
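For completeness, a sketch of how the whole method looks now, assuming heading is the same flattened camera-to-target direction as dir above and eX/eY are still the ellipse extents:

void CalculateScreenPos()
{
    // Flattened direction from camera to target (same as dir above)
    Vector3 heading = transform.position - cam.position;
    heading.y = 0;

    // Yaw-only angle: world-space heading minus the camera's y rotation
    float angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg
                  - Camera.main.transform.eulerAngles.y;

    // Place the blip on the ellipse (eX/eY are the ellipse extents)
    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos((angle + 90f) * Mathf.Deg2Rad),
        eY *  Mathf.Sin((angle + 90f) * Mathf.Deg2Rad)
    );

    minimapBlip.anchoredPosition = desiredPosition;
}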
Thanks for the help so far and happy coding :D