When I set the usesSignificantDigits property to true, NSNumberFormatter does not obey maximumFractionDigits. I think this also applies to minimumFractionDigits.
For example:
import Foundation
let formatter: NSNumberFormatter = NSNumberFormatter()
formatter.locale = NSLocale.currentLocale()
formatter.maximumFractionDigits = 2
formatter.allowsFloats = true
formatter.numberStyle = NSNumberFormatterStyle.DecimalStyle
formatter.usesSignificantDigits = true
let result = formatter.stringFromNumber(1.2345)
The above code returns "1.2345" as the result. The intended result is "1.23". When I set usesSignificantDigits to false, it works.
This contradicts the statement in the developer reference for usesSignificantDigits:
Determines whether the receiver uses minimum and maximum significant digits properties.
What am I doing wrong?
The property usesSignificantDigits controls the use of minimumSignificantDigits and maximumSignificantDigits, not minimumFractionDigits and maximumFractionDigits. You did not set maximumSignificantDigits in your code, so it will use its default value (which seems to be 6).
Consider this enhancement of your original code:
import Foundation
let formatter: NSNumberFormatter = NSNumberFormatter()
formatter.locale = NSLocale.currentLocale()
formatter.maximumFractionDigits = 2
formatter.allowsFloats = true
formatter.numberStyle = NSNumberFormatterStyle.DecimalStyle
// Makes formatter use maximumSignificantDigits
formatter.usesSignificantDigits = true
println(formatter.maximumSignificantDigits) // Prints 6
// Will use 6 significant digits, hence "1.2345"
let result = formatter.stringFromNumber(1.2345)
If you don't want to use the value of maximumSignificantDigits, you should not set usesSignificantDigits to true. This behavior matches the description in the official documentation.
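As a minimal sketch of both options (using the same pre-Swift 3 NSNumberFormatter API as above; the formatter names are only illustrative, and the "1.23" results assume a locale whose decimal separator is "."), you can either leave usesSignificantDigits at its default of false and rely on maximumFractionDigits, or opt in and set maximumSignificantDigits yourself:
import Foundation

// Option 1: leave usesSignificantDigits at its default (false),
// so maximumFractionDigits is honored.
let fractionFormatter = NSNumberFormatter()
fractionFormatter.numberStyle = NSNumberFormatterStyle.DecimalStyle
fractionFormatter.maximumFractionDigits = 2
let fractionResult = fractionFormatter.stringFromNumber(1.2345)       // "1.23"

// Option 2: opt in to significant digits and set the limit explicitly.
let significantFormatter = NSNumberFormatter()
significantFormatter.numberStyle = NSNumberFormatterStyle.DecimalStyle
significantFormatter.usesSignificantDigits = true
significantFormatter.maximumSignificantDigits = 3
let significantResult = significantFormatter.stringFromNumber(1.2345) // "1.23"
Either way the output is "1.23"; the difference is whether rounding is driven by the number of fraction digits or by the total number of significant digits.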