Am I missing something painfully obvious? Or does just nobody in the world actually use java.util.BitSet?
The following test fails:
@Test
public void testBitSet() throws Exception {
    BitSet b = new BitSet();
    b.set(0, true);
    b.set(1, false);
    assertEquals(2, b.length());
}
It's really unclear to me why I don't end up with a BitSet of length 2 and the value 10. I peeked at the source for java.util.BitSet, and on casual inspection it seems to fail to make sufficient distinction between a bit that's been set false and a bit that has never been set to any value...
(Note that explicitly setting the size of the BitSet in the constructor has no effect on this either, e.g. BitSet b = new BitSet(2);)
People do use BitSet; however, they use it for something other than what you intend. It's probably best to think of BitSet as a very compact, memory-efficient form of Set<Integer> that has the peculiar property that you can't put negative numbers into it.
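Under that Set<Integer> model, the failing test above behaves as documented: length() is one past the highest set bit, not the number of bits you've touched, and set(1, false) "removes" an element that was never added. A minimal sketch of the relevant methods (the printed values follow from the Javadoc contract):

```java
import java.util.BitSet;

public class BitSetAsSet {
    public static void main(String[] args) {
        BitSet b = new BitSet(2);       // constructor arg is only a sizing hint
        b.set(0, true);                 // "add 0 to the set"
        b.set(1, false);                // "remove 1" -- a no-op, since 1 was never added
        System.out.println(b.length());      // 1: one past the highest set bit
        System.out.println(b.cardinality()); // 1: number of set bits, the set's "size"
        System.out.println(b.size());        // 64: bits of allocated storage (one long word)
        System.out.println(b.get(1));        // false: same answer for "cleared" and "never set"
    }
}
```

Note that cardinality(), not length(), is the analogue of Set.size().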
It's very common with BitSets to use them in the pattern of
for (int id = set.nextSetBit(0); id >= 0; id = set.nextSetBit(id + 1)) {
    // do stuff to a set index
}
after you do something to fill them up. This is equivalent to iterating over the elements of the Set.
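For completeness, here is a self-contained version of that iteration pattern; the fill-up step and the particular ids are just illustrative:

```java
import java.util.BitSet;

public class BitSetIteration {
    public static void main(String[] args) {
        BitSet set = new BitSet();
        // fill the "set" with some non-negative ids
        set.set(3);
        set.set(7);
        set.set(42);
        // walk the set bits in ascending order, like iterating a sorted Set<Integer>
        for (int id = set.nextSetBit(0); id >= 0; id = set.nextSetBit(id + 1)) {
            System.out.println(id);
        }
        // prints 3, 7, 42
    }
}
```

nextSetBit returns -1 once no higher bit is set, which is what terminates the loop.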