This is a question concerning cross-platform consistency and determinism of floating point operations (i.e. whether they yield different results on different CPUs/systems).
Which one is more likely to stay cross-platform consistent (pseudo code)?
float myFloat = float(myInteger) / float(1024)
or
float myFloat = float(myInteger) / float(1000)
Platforms are C# and AS3.
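For reference, a minimal concrete C# form of the pseudo code might look like this (a sketch only, assuming myInteger is an int; the explicit cast is what prevents integer division in C#):
// Hedged C# sketch of the two cases from the pseudo code above.
int myInteger = 12345;                  // assumed example value
float a = (float)myInteger / 1024f;     // power-of-two divisor
float b = (float)myInteger / 1000f;     // non-power-of-two divisor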
AS3 versions:
var myFloat:Number = myInteger / 1000 // AS3, the '1000' case
var myFloat:Number = myInteger / 1024 // AS3, the '1024' case
- OK, I've added the AS3 versions for clarification; they are equivalent to the 'C pseudo code' above. As you can see, in AS3 all calculations, even on integers, are performed as Floats automatically; a cast is not required (nor can you avoid it or force the runtime to perform true integer division). Hopefully this explains why I'm 'casting' everything into Floats: I am not! That is simply what happens in one of the target languages!
The first one (division by 1024) is likely to be the same on both platforms, since there are no representation issues: 1024 is a power of two, so the division only rescales the exponent. In particular, for small integers (those fitting in 24 bits, i.e. the highest 8 bits of a 32-bit int unused) there is exactly one correct result, and it's very likely that this result is the one you will get.
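To illustrate with an arbitrary value (a hedged sketch, not a proof of determinism):
// 12345 fits in float's 24-bit significand, so dividing by 1024 only
// changes the exponent; the quotient is exact and every conforming
// IEEE 754 division must return the same bits.
int myInteger = 12345;
float f = (float)myInteger / 1024f;            // exactly 12.0556640625
bool roundTrips = (f * 1024f == (float)myInteger); // true: no rounding error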
But I wouldn't rely on it. If you need guaranteed determinism, I recommend implementing the required arithmetic yourself on top of plain integers, for example using a fixed-point representation.
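A minimal sketch of what that could look like in C# (the type Fixed1024 and its members are hypothetical names of my own, not an existing library):
// Fixed-point sketch: values are stored as integer multiples of 1/1024,
// so all arithmetic is done on plain ints and is bit-for-bit identical
// on every platform.
struct Fixed1024
{
    public readonly int Raw;                      // value scaled by 1024
    public Fixed1024(int raw) { Raw = raw; }

    public static Fixed1024 FromInt(int i) => new Fixed1024(i << 10);

    public static Fixed1024 operator +(Fixed1024 a, Fixed1024 b)
        => new Fixed1024(a.Raw + b.Raw);

    public static Fixed1024 operator *(Fixed1024 a, Fixed1024 b)
        => new Fixed1024((int)(((long)a.Raw * b.Raw) >> 10));

    // Convert to float only at the edges (display, rendering),
    // never inside the deterministic calculation itself.
    public float ToFloat() => Raw / 1024f;
}
The same scheme can be expressed in AS3 with int and the shift operators; integer-valued intermediates stay exact in a Number as long as they remain below 2^53.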
The second one (division by 1000) is likely to be inconsistent, even when running the same C# code on different hardware or .NET versions. See the related question "Is floating-point math consistent in C#? Can it be?"