Normally, polar coordinates go from 0 up through π to 2π (just before 2π, really, since 2π is the same angle as 0 again). However, when using the JavaScript atan2() function, I'm getting a different, weird range:
Cartesian X | Cartesian Y | Theta (θ)
===========================================================
     1      |      0      |  0                   (0 × π)
     1      |      1      |  0.7853981633974483  (0.25 × π)
     0      |      1      |  1.5707963267948966  (0.5 × π)
    -1      |      1      |  2.356194490192345   (0.75 × π)
    -1      |      0      |  3.141592653589793   (1 × π)
    -1      |     -1      | -2.356194490192345   (-0.75 × π)
     0      |     -1      | -1.5707963267948966  (-0.5 × π)
     1      |     -1      | -0.7853981633974483  (-0.25 × π)
As you can see, after it reaches π (180°), it jumps down to –π (–180°) and climbs back up toward 0. How can I get it to use the range {0, ..., 2π} instead of {–π, ..., π}? I've been trying to work out a calculation to "fix" the values, but I would also like to know why JavaScript chooses this range instead of the typical polar range. Thanks!
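For reference, a minimal snippet along these lines reproduces the table above (the point list and logging are just for illustration; note that Math.atan2 takes y first, then x):

```js
// Minimal reproduction of the table above; Math.atan2 takes y first, then x.
const points = [
  [1, 0], [1, 1], [0, 1], [-1, 1],
  [-1, 0], [-1, -1], [0, -1], [1, -1],
];

for (const [x, y] of points) {
  const theta = Math.atan2(y, x);
  console.log(`x = ${x}, y = ${y}, theta = ${theta} (${theta / Math.PI} × π)`);
}
```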
It's pretty standard for atan2 to return angles in that range; for instance, that's what the atan2 in the C standard library does.
If you want 0..2π instead of -π..π, test whether the result is negative and add 2π if it is.
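A minimal sketch of that adjustment in JavaScript (the function name atan2InTwoPi is just a placeholder, not anything built in):

```js
// Math.atan2 returns a value in [-π, π]; add 2π to negative results
// to shift them into the upper half of [0, 2π).
// The function name is just a placeholder, not a built-in.
function atan2InTwoPi(y, x) {
  const theta = Math.atan2(y, x);
  return theta < 0 ? theta + 2 * Math.PI : theta;
}

console.log(atan2InTwoPi(-1, -1) / Math.PI); // 1.25 (atan2 alone gives -0.75)
console.log(atan2InTwoPi(1, 1) / Math.PI);   // 0.25 (non-negative, unchanged)
```

This maps the negative half of the output onto (π, 2π) and leaves non-negative results untouched.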