How does BitSet's set method work with bits shifting to the left?


Java's BitSet class has a method `set` that sets a single bit to 1 (= true). The method's source code is as follows:

public void set(int bitIndex) {
    if (bitIndex < 0)
        throw new IndexOutOfBoundsException("bitIndex < 0: " + bitIndex);

    int wordIndex = wordIndex(bitIndex);
    expandTo(wordIndex);

    words[wordIndex] |= (1L << bitIndex); // Restores invariants

    checkInvariants();
}

Aside from the checks, the method's core is `words[wordIndex] |= (1L << bitIndex)`. I can see that the left-hand side of the assignment is the specific word that holds the relevant bit. However, I don't understand how the right-hand side (shifting left by the bit index) causes only the requested bit to be set to 1. Could you please explain?


Solution

  • 1L << bitIndex produces a long, whose bits are all 0s, except for one of the bits. The position of the "1" bit is determined by bitIndex. For example, if bitIndex is 10, then the 11th least significant bit is 1.

    0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0100 0000 0000
    

    This is because the 1 is shifted to the left 10 positions. In general, the (bitIndex mod 64 + 1)th least significant bit is the "1" bit, because when shifting a long Java uses only the low six bits of the shift distance (effectively bitIndex & 63).
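    To make this concrete, here is a small sketch (class and variable names are illustrative, not from the BitSet source) showing the mask that `1L << bitIndex` produces, including Java's implicit masking of the shift distance for `long` shifts:

    ```java
    public class ShiftMaskDemo {
        public static void main(String[] args) {
            // 1L << 10 sets only the 11th least significant bit (index 10)
            long mask = 1L << 10;
            System.out.println(Long.toBinaryString(mask)); // prints 10000000000

            // For long shifts, Java uses only the low 6 bits of the shift
            // distance, so 1L << 64 is the same as 1L << 0
            System.out.println((1L << 64) == 1L);          // true
            System.out.println((1L << 70) == (1L << 6));   // true
        }
    }
    ```

    This masking of the shift distance is also why `wordIndex` must be computed separately (as `bitIndex / 64`): the shift alone only determines the position of the bit *within* its word.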

    This mask is then bitwise OR'ed with whatever is in words[wordIndex]. Every bit in words[wordIndex] stays the same because they are OR'ed with 0, except for the one place where it is a "1" in the mask. That bit in words[wordIndex] will become "1" no matter what it was originally, due to how OR works.