The following code:
WScript.Echo FormatNumber(0.123, 0)
WScript.Echo FormatNumber(0.123, 1)
WScript.Echo FormatNumber(0.123, 2)
WScript.Echo FormatNumber(0.123, 0, TristateTrue)
WScript.Echo FormatNumber(0.123, 1, TristateTrue)
WScript.Echo FormatNumber(0.123, 2, TristateTrue)
WScript.Echo FormatNumber(0.123, 0, TristateFalse)
WScript.Echo FormatNumber(0.123, 1, TristateFalse)
WScript.Echo FormatNumber(0.123, 2, TristateFalse)
WScript.Echo FormatNumber(0.123, 0, TristateUseDefault)
WScript.Echo FormatNumber(0.123, 1, TristateUseDefault)
WScript.Echo FormatNumber(0.123, 2, TristateUseDefault)
outputs as:
0
0,1
0,12
,1
,12
,1
,12
,1
,12
Can anyone explain why passing TristateTrue makes no difference compared with TristateFalse (or TristateUseDefault)?
FYI:
FormatNumber(Expression [,NumDigitsAfterDecimal [,IncludeLeadingDigit [,UseParensForNegativeNumbers [,GroupDigits]]]])
IncludeLeadingDigit
Optional. Tristate constant that indicates whether or not a leading zero is displayed for fractional values. See Settings section for values.
BTW, my computer's regional settings (Windows 10 Pro 64-bit, French locale) say I should have my leading zero!
As requested, my comment as answer:
You need to define these constants in the script. VBScript does not predefine the Tristate* names, so each one evaluates to Empty, which coerces to 0, the value of TristateFalse. That is why all three variants behaved identically:
Const TristateUseDefault = -2
Const TristateTrue = -1
Const TristateFalse = 0
As a tip: start your scripts with Option Explicit, so that references to undefined variables raise errors instead of silently evaluating to Empty.
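A minimal corrected sketch combining both suggestions (the decimal separator in the output depends on your regional settings, so no exact output is shown):

```vbscript
Option Explicit

' The Tristate constants are not built into VBScript, so define them.
Const TristateUseDefault = -2   ' follow the regional setting
Const TristateTrue = -1         ' always display the leading zero
Const TristateFalse = 0         ' never display the leading zero

' With the constant properly defined, TristateTrue now takes effect:
' the leading zero appears (e.g. "0,12" on a French system).
WScript.Echo FormatNumber(0.123, 2, TristateTrue)
WScript.Echo FormatNumber(0.123, 2, TristateFalse)
```

Alternatively, VBScript's intrinsic constants vbTrue (-1), vbFalse (0), and vbUseDefault (-2) have the same values and can be passed directly without any Const declarations.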