As in many other languages, Swift's / operator performs integer division when both operands are integers, so:
let n = 1 / 2
print(n) // 0
If you want floating-point division, you have to write one of the following:
let n1 = 1.0 / 2
let n2 = 1 / 2.0
let n3 = Double(1) / 2
let n4 = 1 / Double(2)
print(n1) // 0.5
print(n2) // 0.5
print(n3) // 0.5
print(n4) // 0.5
And, as in most other languages, you can't just cast the whole operation:
let n5 = Double(1 / 2)
print(n5) // 0.0
This happens because Swift performs the integer division 1 / 2, gets 0, and then converts that 0 to a Double, effectively giving you 0.0.
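Splitting it into two steps makes that intermediate Int visible (the name quotient is just for illustration):
let quotient = 1 / 2          // both literals default to Int, so this is 0
let n5 = Double(quotient)     // the Int 0 is converted to a Double
print(n5)                     // 0.0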
I am curious as to why the following works:
let n6 = (1 / 2) as Double
print(n6) // 0.5
I feel like this should produce the same result as Double(1 / 2). Why doesn't it?
1 and 2 are literals. They have no type unless you give them a type from context.
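For example, the same literal ends up as a different type depending on the context it appears in (the names a and b are just for illustration):
let a = 1            // no context, so the literal defaults to Int
let b: Double = 1    // the annotation gives the literal a Double context, so b is 1.0
print(type(of: a))   // Int
print(type(of: b))   // Double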
let n6 = (1 / 2) as Double
is essentially the same as
let n6: Double = 1 / 2
In other words, you tell the compiler that the result is a Double. The compiler therefore searches for an operator / that returns a Double, finds the one that takes two Double operands, and consequently treats both literals as Doubles.
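Note that this only works because the operands are literals. With values that already have type Int, the same coercion is rejected; a small sketch, with x and y as illustrative names:
let x = 1
let y = 2
// let bad = (x / y) as Double   // does not compile: an Int value cannot be coerced to Double with as
let good = Double(x) / Double(y) // convert each operand instead: 0.5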
On the other hand,
let n5 = Double(1 / 2)
is a cast (or, better said, the initialization of a Double). That means the expression 1 / 2 is evaluated first, on Int operands, and the result is then converted to a Double.
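Putting the two forms side by side (the names are only for illustration):
let viaCoercion = (1 / 2) as Double  // both literals become Double, so this is 1.0 / 2.0
let viaInit = Double(1 / 2)          // integer division happens first, so this is Double(0)
print(viaCoercion)                   // 0.5
print(viaInit)                       // 0.0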