So I have for the longest time created "bitsets" in Java.
I have always thought that if I use a long I have 64 bits to play with.
However, I have now discovered it's not that simple,
and I suspect it has to do with overflow and casting:
long x = 1 << 30;
return x + "=" + Long.toBinaryString(x);
Renders:
1073741824=1000000000000000000000000000000
Expected.
But:
long x = 1 << 31;
return x + "=" + Long.toBinaryString(x);
Renders:
-2147483648=1111111111111111111111111111111110000000000000000000000000000000
?????
I assume it has to do with how the value of the shift is being calculated before it is rendered as a long, i.e. the shift is evaluated as an int, overflows, and only then is the result cast to a long.
I expected
10000000000000000000000000000000
and the long value to be some negative number (I can't figure out what it is).
There are definitely 64 bits in a long in Java... as
Long.toBinaryString(Long.MAX_VALUE);
Renders: 111111111111111111111111111111111111111111111111111111111111111 (63 1's; the 64th bit is the sign bit).
How do I use the rest of the long space in my bitset?
This is because 1 << 31 is an int, equal to Integer.MIN_VALUE, which you then widen to a long.
Make the first operand a long:
1L << 31
So this:
long x = 1L << 31;
return x + "=" + Long.toBinaryString(x);
Renders:
2147483648=10000000000000000000000000000000
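To actually use all 64 bits in a bitset, every shift must use the 1L operand so the arithmetic stays in long. A minimal sketch of a single-long bitset (the class and method names here are my own invention, not a standard API):

```java
// Minimal 64-bit bitset backed by one long.
// The key detail: 1L << i, never 1 << i, so bits 32-63 work.
public class LongBitSet {
    private long bits;

    public void set(int i) {
        bits |= 1L << i;          // long shift, valid for i in 0..63
    }

    public void clear(int i) {
        bits &= ~(1L << i);
    }

    public boolean get(int i) {
        return (bits & (1L << i)) != 0;
    }

    public static void main(String[] args) {
        LongBitSet s = new LongBitSet();
        s.set(31);
        s.set(63);
        System.out.println(s.get(31)); // true
        System.out.println(s.get(63)); // true
        System.out.println(s.get(30)); // false
    }
}
```

Note also that shift amounts on a long are taken mod 64, while on an int they are taken mod 32, so keeping the left operand a long matters for the shift distance as well.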