I've been using the Measurement object to convert units, mostly lengths. But I have a strange issue: if I convert from miles to feet, I get almost, but not exactly, the right answer.
import Foundation
let heightFeet = Measurement(value: 6, unit: UnitLength.feet) // 6.0ft
let heightInches = heightFeet.converted(to: UnitLength.inches) // 72.0 in
let heightMeters = heightFeet.converted(to: UnitLength.meters) // 1.8288 m
let lengthMiles = Measurement(value: 1, unit: UnitLength.miles) // 1.0 mi
let lengthFeet = lengthMiles.converted(to: UnitLength.feet) // 5279.98687664042 ft
// Should be 5280.0
They all work except the last one, lengthFeet. In my playground (Xcode Version 9.2 (9C40b)) it returns 5279.98687664042 ft. I also tested in a regular app build and got the same result.
Any ideas what is going on?
You can see the definition of UnitLength here. Every unit of length has a name and a coefficient. The mile unit has a coefficient of 1609.34, and the foot unit has a coefficient of 0.3048. When represented as a Double (an IEEE 754 double-precision floating-point number), the closest representable values are 1609.3399999999999 and 0.30480000000000002, respectively.
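You can see those nearest representations yourself in a playground; this sketch just prints the two coefficients with 17 significant digits (enough to round-trip any Double):

```swift
import Foundation

// Neither decimal literal is exactly representable in binary floating
// point; printing with 17 significant digits reveals the Double value
// that is actually stored.
let mileCoefficient = 1609.34
let footCoefficient = 0.3048
print(String(format: "%.17g", mileCoefficient)) // 1609.3399999999999
print(String(format: "%.17g", footCoefficient)) // 0.30480000000000002
```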
When you do the conversion 1 * 1609.34 / 0.3048, you get 5279.9868766404197 rather than the expected 5280. That's just a consequence of the imprecision of fixed-precision floating-point math.
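You can reproduce the same arithmetic by hand, routing through the meter base unit the way the linear converter does:

```swift
import Foundation

// Mimic the converter: miles -> base unit (meters) -> feet,
// using the same coefficients that UnitLength uses.
let miles = 1.0
let meters = miles * 1609.34   // multiply by the mile coefficient
let feet = meters / 0.3048     // divide by the foot coefficient
print(feet)                    // close to 5280, but not exactly 5280.0
print(feet == 5280.0)          // false
```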
This could be mitigated if the base unit of length were the mile. That would be incredibly undesirable, of course, because most of the world doesn't use this crazy system, but it could be done. Foot could then be defined with a coefficient of 5280, which can be represented exactly as a Double. But now, instead of mile->foot being imprecise, meter->kilometer will be imprecise. You can't win, I'm afraid.