According to the docs here, the Swift 3/4 Decimal type is a base-10 representation bridged to NSDecimalNumber. However, I'm seeing precision issues that don't reproduce when I use NSDecimalNumber directly.
let dec24 = Decimal(integerLiteral: 24)
let dec1 = Decimal(integerLiteral: 1)
let decResult = dec1/dec24*dec24
print(decResult)
// prints 0.99999999999999999999999999999999999984

let dn24 = NSDecimalNumber(value: 24)
let dn1 = NSDecimalNumber(value: 1)
let dnResult = dn1.dividing(by: dn24).multiplying(by: dn24)
print(dnResult)
// prints 1
Shouldn't the Decimal struct be accurate, or am I misunderstanding something?
NSDecimalNumber (and its overlay type Decimal) can represent

... any number that can be expressed as mantissa x 10^exponent, where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.
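For illustration, a value of that form can be built directly from a mantissa (significand) and an exponent; a minimal sketch with purely illustrative numbers:

import Foundation

// 123 x 10^-2, i.e. mantissa 123 and exponent -2:
let d = Decimal(sign: .plus, exponent: -2, significand: Decimal(123))
print(d) // 1.23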
So decimal fractions (with up to 38 decimal digits) can be represented exactly, but not arbitrary numbers. In particular, 1/24 = 0.041666... has infinitely many decimal digits (it is a repeating decimal) and cannot be represented exactly as a Decimal.
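A small sketch of that distinction (the values here are illustrative, not taken from the question): a terminating fraction such as 0.1 is stored exactly and survives arithmetic, while 1/24 is rounded to 38 significant digits.

import Foundation

// 0.1 is a terminating decimal fraction, so it is represented exactly:
let tenth = Decimal(string: "0.1")!
print(tenth + tenth + tenth == Decimal(string: "0.3")!)  // true: exact decimal arithmetic

// 1/24 repeats forever, so it is rounded to 38 significant digits,
// and multiplying back by 24 does not recover exactly 1:
let oneTwentyFourth = Decimal(1) / Decimal(24)
print(oneTwentyFourth * Decimal(24) == Decimal(1))       // false: 1/24 was rounded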
Also, there is no precision difference between Decimal and NSDecimalNumber. That becomes apparent if we print the difference between the actual result and the "theoretical result":
let dec24 = Decimal(integerLiteral: 24)
let dec1 = Decimal(integerLiteral: 1)
let decResult = dec1/dec24*dec24
print(decResult - dec1)
// -0.00000000000000000000000000000000000016
let dn24 = NSDecimalNumber(value: 24)
let dn1 = NSDecimalNumber(value: 1)
let dnResult = dn1.dividing(by: dn24).multiplying(by: dn24)
print(dnResult.subtracting(dn1))
// -0.00000000000000000000000000000000000016
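And because Decimal is bridged to NSDecimalNumber, the two results can be compared directly; a minimal sketch reusing the names from the snippet above:

print(NSDecimalNumber(decimal: decResult).compare(dnResult) == .orderedSame)
// true: both computations produce the same value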