I am terribly annoyed by the inaccuracy of the intrinsic trig functions in the CLR. It is well known that

    Math.Sin(Math.PI) = 0.00000000000000012246063538223773

instead of 0. Something similar happens with Math.Cos(Math.PI/2).
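You can reproduce both with a couple of lines (the exact values may differ slightly between runtimes and platforms):

    using System;

    class Demo
    {
        static void Main()
        {
            Console.WriteLine(Math.Sin(Math.PI));     // on the order of 1E-16, not 0
            Console.WriteLine(Math.Cos(Math.PI / 2)); // on the order of 1E-17, not 0
        }
    }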
But when I am doing a long series of calculations that in special cases reduce to

    Math.Sin(Math.PI/2 + x) - Math.Cos(x)

the result is zero for x = 0.2 but not for x = 0.1 (try it). Another issue is that when the argument is a large number, the inaccuracy grows roughly in proportion to it.
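For example, this small program prints the difference for both values of x (the exact nonzero result will depend on the runtime and platform):

    using System;

    class TrigDifference
    {
        static void Main()
        {
            // Mathematically sin(pi/2 + x) == cos(x), so the difference should be 0.
            foreach (double x in new[] { 0.1, 0.2 })
            {
                double diff = Math.Sin(Math.PI / 2 + x) - Math.Cos(x);
                Console.WriteLine("x = {0}: {1}", x, diff);
            }
        }
    }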
So I wonder if anyone has coded a more accurate implementation of the trig functions in C# and shared it with the world. Does the CLR call some standard C math library implementing CORDIC (https://en.wikipedia.org/wiki/CORDIC) or something similar?
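(For context, rotation-mode CORDIC for sine/cosine looks roughly like the sketch below. Written in plain doubles it is only an illustration of the algorithm and will not beat Math.Sin for accuracy; the class and method names here are made up.)

    using System;

    static class Cordic
    {
        // Rotation-mode CORDIC: computes sin and cos for theta roughly in [-pi/2, pi/2].
        public static void SinCos(double theta, out double sin, out double cos, int iterations = 50)
        {
            // Gain compensation K = product of cos(atan(2^-i)) over all iterations.
            double k = 1.0;
            for (int i = 0; i < iterations; i++)
                k /= Math.Sqrt(1.0 + Math.Pow(2.0, -2 * i));

            double x = k, y = 0.0, z = theta, pow2 = 1.0;
            for (int i = 0; i < iterations; i++)
            {
                double d = z >= 0 ? 1.0 : -1.0;   // rotate toward the remaining angle
                double xNext = x - d * y * pow2;
                double yNext = y + d * x * pow2;
                z -= d * Math.Atan(pow2);         // a real implementation uses a precomputed table
                x = xNext;
                y = yNext;
                pow2 *= 0.5;
            }
            sin = y;
            cos = x;
        }
    }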
You need to use an arbitrary-precision decimal library. (.NET 4.0 has an arbitrary-precision integer type, System.Numerics.BigInteger, but no arbitrary-precision decimal.)
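The idea such a library implements is to evaluate the series in higher-precision arithmetic. As a rough sketch of the approach (using System.Decimal's 28-29 significant digits rather than true arbitrary precision; the type and method names are made up):

    using System;

    static class DecimalTrig
    {
        // Taylor-series sine evaluated in System.Decimal.
        public static decimal Sin(decimal x)
        {
            // Reduce the argument into [-pi, pi] using a high-precision value of 2*pi.
            const decimal twoPi = 6.2831853071795864769252867666m;
            x %= twoPi;
            if (x > twoPi / 2) x -= twoPi;
            if (x < -twoPi / 2) x += twoPi;

            decimal term = x, sum = x;
            for (int n = 1; n <= 20; n++)
            {
                // term_n = -term_(n-1) * x^2 / ((2n) * (2n + 1))
                term *= -x * x / ((2 * n) * (2 * n + 1));
                sum += term;
                if (term == 0m) break;
            }
            return sum;
        }
    }

A real library carries far more digits than decimal and does the argument reduction much more carefully.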
A few popular ones are available: