I'm developing in Java (JDK 1.8) and manipulating BitSets, and I came across a strange issue.
I'm instantiating a BitSet of size 160 like:
BitSet example = new BitSet(160);
I want to check the size using the size() method, which returns the number of bits in the bit set. The documentation says that the constructor taking an int N creates a bit set of N bits.
But when I do check the size right after with
example.size()
I obtain the value
192
I do not understand why; has anyone come across this kind of problem? Link to the documentation: http://docs.oracle.com/javase/7/docs/api/java/util/BitSet.html
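For reference, here is a minimal program that reproduces what I'm seeing (the class name is just my own test harness):

import java.util.BitSet;

public class BitSetSizeDemo {
    public static void main(String[] args) {
        // Ask for 160 bits, then check the reported size
        BitSet example = new BitSet(160);
        System.out.println(example.size()); // prints 192, not 160
    }
}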
This is because the BitSet constructor creates a BitSet "whose initial size is large enough to explicitly represent" bits in the range given by the parameter. So the actual size will be at least the number you pass in, but not necessarily equal to it.
The reason it is 192 in particular is that BitSet stores its bits in an array of 64-bit long words, so size() reports the allocated capacity rounded up to a whole number of words; 160 bits need three words, and 3 * 64 = 192.
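A quick sketch of that rounding behavior (the class name here is just illustrative; the printed values follow from size() reporting whole 64-bit words):

import java.util.BitSet;

public class WordRounding {
    public static void main(String[] args) {
        // size() reports the allocated capacity, in whole 64-bit words
        System.out.println(new BitSet(1).size());   // 64  (1 word)
        System.out.println(new BitSet(64).size());  // 64  (still 1 word)
        System.out.println(new BitSet(65).size());  // 128 (2 words)
        System.out.println(new BitSet(160).size()); // 192 (3 words, as you observed)

        // length() and cardinality() describe the bits actually set, not the capacity
        BitSet b = new BitSet(160);
        System.out.println(b.length());      // 0 -- index of highest set bit + 1
        System.out.println(b.cardinality()); // 0 -- number of set bits
    }
}

If what you actually care about is the logical extent of the bit set rather than its backing storage, length() or cardinality() is usually the more useful call.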