I thought the whole point of the Decimal
type was arbitrary precision arithmetic. (Or rather I thought that Decimal supported arbitrary precision as well as supporting base-10 arithmetic.)
However, an example I ran into while looking at somebody's question led me to believe that it does have a limit. Consider this code:
import Foundation

let string = "728509129536673284379577474947011174006"
if let decimal = Decimal(string: string) {
    print("string\n'\(string)' as a Decimal =\n'\(decimal)'")
} else {
    print("error converting '\(string)' to a Decimal")
}
That outputs
string
'728509129536673284379577474947011174006' as a Decimal =
'728509129536673284379577474947011174000'
It looks like the last digit gets lost. I tried various other values, and they all end up with a zero in that last position. (It looks like the low-order digit gets truncated to zero when the value contains 39 decimal digits.)
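One way to see where the digits start disappearing is a simple round-trip check. This is just a sketch (keepsAllDigits is a helper I made up, and the sample values are arbitrary): it parses the string as a Decimal and compares the printed result with the input.

import Foundation

// Sketch: does Decimal preserve every digit of the given string?
func keepsAllDigits(_ digits: String) -> Bool {
    guard let decimal = Decimal(string: digits) else { return false }
    return "\(decimal)" == digits
}

print(keepsAllDigits("12345678901234567890123456789012345678"))   // 38 digits -> true
print(keepsAllDigits("728509129536673284379577474947011174006"))  // 39 digits -> false in my test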
Is that documented somewhere?
Decimal is the bridged version of NSDecimalNumber:

"An object for representing and performing arithmetic on base-10 numbers that bridges to Decimal; ..."

Its representation is also documented in NSDecimalNumber:
"An instance can represent any number that can be expressed as mantissa x 10^exponent, where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127."
That is a bit inaccurate, since the _mantissa
property does not actually store 38 decimal digits, but 128 bits.
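That 128-bit limit is where the "38 or sometimes 39 digits" comes from: 2^128 - 1 = 340282366920938463463374607431768211455 has 39 digits, so 39-digit values up to that bound can still be stored exactly, while anything larger cannot. A quick sketch (treat the exact output of the too-large case as illustrative; the point is only that the trailing digit cannot survive):

import Foundation

// 2^128 - 1: the largest integer a 128-bit mantissa can hold (39 digits).
if let fits = Decimal(string: "340282366920938463463374607431768211455") {
    print(fits)    // all 39 digits survive
}

// One larger no longer fits in the mantissa, so a digit of precision is lost.
if let tooBig = Decimal(string: "340282366920938463463374607431768211456") {
    print(tooBig)  // the last digit can no longer be represented exactly
}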
In any case, you could say that the "maximum precision" of Decimal
is 38 digits (or sometimes 39 digits depending on the mantissa), in the same sense that Double
has a "maximum precision" of 15 digits (or sometimes 16 digits depending on the mantissa).
When you try to parse a number that is more precise than Decimal
can handle, you can see that the precision is lost:
Decimal(string: "3402823669209384634633746074317682114551") // 40 digits
// results in 3402823669209384634633746074317682114550
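If you want to inspect what was actually stored, Decimal exposes its exponent and significand. A sketch (the exact split between the two is an implementation detail, so the first two printed values are illustrative):

import Foundation

if let d = Decimal(string: "3402823669209384634633746074317682114551") {
    print(d.significand)  // the integer digits that were kept
    print(d.exponent)     // the power of ten they are scaled by
    print(d)              // 3402823669209384634633746074317682114550
}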