I'm getting wrong output from the atan2 function.
I have two points, p1 = (x1, y1) and p2 = (x2, y2):
import math

def angleFun(x1, y1, x2, y2):
    dx = x2 - x1
    dy = y2 - y1
    radAngle = math.atan2(dy, dx)      # angle in radians, in (-pi, pi]
    degAngle = math.degrees(radAngle)  # convert to degrees, in (-180, 180]
    if degAngle < 0:                   # normalise to [0, 360)
        degAngle += 360
    print(degAngle)
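For example, I call it by unpacking each pair of points (these are the same inputs as in the examples below):

angleFun(0.49609900927256284, 0.5, 0.5039009907274372, 0.5)    # prints 0.0
angleFun(0.5, 0.506935094626555, 0.5, 0.49306490537344505)     # prints 270.0
angleFun(0.4972415830032833, 0.5049038524386074,
         0.5027584169967166, 0.4950961475613926)               # prints 299.357753543...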
This code works fine for 0- and 90-degree lines only; for other values it doesn't give an accurate result. For example:
([0.49609900927256284, 0.5], [0.5039009907274372, 0.5]) gives 0 degrees.
([0.5, 0.506935094626555], [0.5, 0.49306490537344505]) gives 270 degrees
but
([0.4972415830032833, 0.5049038524386074], [0.5027584169967166, 0.4950961475613926]) gives 299.357753543 degrees.
whereas it should give 315 or -45 degrees.
I really can't understand why; any angle other than 0 or 90 gets the wrong value. Strictly Python and the math library. I've been struggling with this for quite a while now. Any help will be appreciated. Thank you.
    ([0.4972415830032833, 0.5049038524386074], [0.5027584169967166, 0.4950961475613926]) gives 299.357753543 degrees.
    whereas it should give 315 or -45 degrees.
No, it shouldn't. In that case,
dx == 0.5027584169967166 - 0.4972415830032833 == 0.005516833993433334
dy == 0.4950961475613926 - 0.5049038524386074 == -0.009807704877214773
For +315 / -45 degrees, you need dx > 0 and dy == -dx. The latter condition is not satisfied by the specified input. Qualitatively, you would expect a result farther from angle 0, exactly as you observe: with dy roughly equal to -1.78 * dx (close to -2 * dx), something near +300 / -60 is the right ballpark. I see no reason to think that your code is computing an incorrect answer for that input.
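A quick check with the same math module bears this out, and also shows the kind of input that really does produce 315 (dx and dy are the values computed above):

import math

dx = 0.5027584169967166 - 0.4972415830032833   # ~ 0.005516833993
dy = 0.4950961475613926 - 0.5049038524386074   # ~ -0.009807704877

# The angle your code computes for this input:
print(math.degrees(math.atan2(dy, dx)) % 360)   # ~299.3577..., as reported

# A genuine -45-degree slope needs dy == -dx; only then do you get 315:
print(math.degrees(math.atan2(-dx, dx)) % 360)  # 315.0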