If I run the following code:
BitSet test = new BitSet(20);
System.out.println("Bitset size is " + test.size());
System.out.println("Bitset length is " + test.length());
I get the output:
Bitset size is 64
Bitset length is 0
Which makes sense now that I look closer at the documentation (size() reports the storage actually allocated, including implementation overhead, while length() returns one past the index of the highest set bit, and all bits default to false), but it is not what I want.
Since the BitSet I actually use can have varying lengths, I want to be able to get back the number of bits represented (ignoring whether they are set). Is there a way to do that (hopefully with a built-in method)?
I realize I could try flipping all the bits and then doing length, or by tracking the variable I use to instantiate the BitSet, but I feel like both would require several lines of comments with my reasons and I was hoping for something a bit more self-documenting.
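For what it's worth, the "track the variable" option could be made self-documenting by putting the BitSet and its intended bit count behind a small wrapper. A rough sketch of what I mean (SizedBitSet is just a name I made up, not an existing class):

```java
import java.util.BitSet;

// Hypothetical wrapper: pairs a BitSet with the fixed number of bits it represents.
public class SizedBitSet {
    private final BitSet bits;
    private final int numBits;

    public SizedBitSet(int numBits) {
        this.numBits = numBits;
        this.bits = new BitSet(numBits);
    }

    // The number of bits represented, regardless of how many are set.
    public int numBits() {
        return numBits;
    }

    public void set(int index) {
        if (index < 0 || index >= numBits) {
            throw new IndexOutOfBoundsException(
                "bit " + index + " outside 0.." + (numBits - 1));
        }
        bits.set(index);
    }

    public boolean get(int index) {
        return bits.get(index);
    }
}
```

That way the bit count lives next to the BitSet instead of in a loose variable that needs a comment explaining why it is kept around.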
A BitSet will automatically expand to whatever size is needed to hold the highest bit set in it; the initial size passed to the constructor is merely a hint about how many bits you intend to use, and it is not stored anywhere. So "the number of bits represented" is not a well-defined quantity for a BitSet, and your question as posed has no answer.
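To illustrate: setting a bit beyond the initial hint simply grows the set, and afterwards neither size() nor length() tells you anything about the 20 you originally passed in.

```java
import java.util.BitSet;

public class BitSetGrowth {
    public static void main(String[] args) {
        BitSet test = new BitSet(20);      // 20 is only a sizing hint
        System.out.println(test.size());   // 64: one whole long word of backing storage
        System.out.println(test.length()); // 0: no bit is set yet

        test.set(100);                     // past the hint; the BitSet grows transparently
        System.out.println(test.size());   // 128: storage expanded to two long words
        System.out.println(test.length()); // 101: highest set bit index + 1
    }
}
```

Nothing here errors or truncates at bit 20, which is why the constructor argument cannot be recovered later.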