As the title states, lldb reports the value of UInt.max to be a UInt of -1, which seems highly illogical. Considering that let uint: UInt = -1 doesn't even compile, how is this even possible? I don't see any way to have a negative value of UInt at runtime because the initializer will crash if given a negative value. I want to know the actual maximum value of UInt.
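For reference, here's a small sketch of the behavior I mean; the UInt(exactly:) line is just an extra illustration of a non-trapping alternative:
// let uint: UInt = -1        // rejected at compile time, as noted above
let n = -1
// UInt(n) would trap at runtime because n is negative.
// UInt(exactly:) returns nil instead of trapping:
let maybeUInt = UInt(exactly: n)
print(maybeUInt as Any)       // prints "nil"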
The Int value of -1 and the UInt value UInt.max have the same bit representation in memory.
You can see that if you do:
let i = Int(bitPattern: UInt.max) // i == -1
and in the opposite direction:
if UInt(bitPattern: Int(-1)) == UInt.max {
    print("same")
}
Output:
same
The debugger is incorrectly displaying UInt.max as a signed Int. They have the same bit representation in memory (0xffffffffffffffff on a 64-bit system such as an iPhone 6, and 0xffffffff on a 32-bit system such as an iPhone 5), and the debugger apparently chooses to show that value as an Int.
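If it helps, a quick way to confirm the shared bit pattern yourself is to print both values in hexadecimal:
// Both lines print "ffffffffffffffff" on a 64-bit system
// (and "ffffffff" on a 32-bit one): the bit patterns are identical.
print(String(UInt.max, radix: 16))
print(String(UInt(bitPattern: -1), radix: 16))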
You can see the same issue if you do:
print(String(format: "%d", UInt.max)) // prints "-1"
It doesn't mean UInt.max is -1, just that both have the same representation in memory.
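As an aside, if you do want String(format:) to show the unsigned value, an unsigned format specifier should do it; on Apple platforms UInt and unsigned long are the same size, so %lu prints the actual maximum:
import Foundation

// %d treats the argument as a signed 32-bit int, hence "-1".
// %lu reads it as an unsigned long (the same width as UInt on
// Apple platforms), so the real maximum is shown.
print(String(format: "%lu", UInt.max))   // 18446744073709551615 on 64-bit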
To see the maximum value of UInt, do the following in an app or on a Swift Playground:
print(UInt.max)
This will print 18446744073709551615 on a 64-bit system (such as a Macintosh or an iPhone 6) and 4294967295 on a 32-bit system (such as an iPhone 5).
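As a quick sanity check, that number is simply the value with every bit set for the platform's word size:
// ~0 flips every bit of zero, giving the all-ones pattern,
// which is exactly UInt.max for the platform's word size.
let allOnes: UInt = ~0
print(allOnes == UInt.max)   // true
print(UInt.bitWidth)         // 64 on a 64-bit system, 32 on a 32-bit one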
In lldb:
(lldb) p String(UInt.max)
(String) $R0 = "18446744073709551615"
(lldb)