I am wondering about some behavior I am seeing in VBA. I am using the Immediate window to debug my code and got the following result:
Debug.Print(Int(0.00024575*10000000+0.5))
2457
I would expect this result to be 2458, since 0.00024575 * 10000000 + 0.5 = 2458 in exact decimal arithmetic. Does anyone know why this is occurring? As expected, before the Int function is called, the calculation is performed in Double precision (VarType returns 5, i.e. vbDouble):
Debug.Print(VarType(0.00024575*10000000+0.5))
5
Any help is greatly appreciated!
If you run the following test:
Sub Test()
    Dim d As Double: d = 0.5 + 0.00024575 * 10000000
    Debug.Assert d = Int(d)
    Debug.Assert 0.5 + 2457.5 = Int(0.5 + 2457.5)
    Debug.Assert 0.5 + 0.00024575 * 10000000 = Int(CDbl(0.5 + 0.00024575 * 10000000))
    Debug.Assert 0.5 + 0.00024575 * 10000000 = CDbl(0.5 + 0.00024575 * 10000000)
    Debug.Assert 0.5 + 0.00024575 * 10000000 = VBA.Int(0.5 + 0.00024575 * 10000000)
    Debug.Assert 0.5 + 0.00024575 * 10000000 = Int(0.5 + 0.00024575 * 10000000) 'fails
End Sub
You will see that only the last Assert fails. This is probably a compiler bug. Note that Int is not the same as VBA.Int.
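To see the difference directly, here is a minimal sketch (the Sub name is mine); the expected outputs follow from the question and from the Asserts above:

Sub IntVsVbaInt()
    'The two calls disagree on the same constant expression:
    Debug.Print Int(0.5 + 0.00024575 * 10000000)     '2457
    Debug.Print VBA.Int(0.5 + 0.00024575 * 10000000) '2458
End Sub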
As suggested in the comments by @GSerg, it appears that the Int method (not VBA.Int) does indeed calculate at compile time, and it probably uses a different floating-point precision. For example:
Option Explicit

#Const Test = Int(12.5)

Sub TestPrecompiler()
    #If Test = 12 Then
        Debug.Print "Int method calculates at compile time"
    #End If
End Sub
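A practical consequence: if you need the rounding to happen at run time in Double precision, assign the intermediate result to a Double variable first (as the first Assert in Test above confirms works) or call VBA.Int explicitly. A minimal sketch, with a Sub name of my choosing:

Sub RoundAtRunTime()
    'Forcing the arithmetic into a Double variable makes the
    'truncation happen at run time, not in the compiler.
    Dim d As Double
    d = 0.00024575 * 10000000 + 0.5
    Debug.Print Int(d) '2458
End Sub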