I am a bit puzzled by BitVector32's behaviour. See the test:
[TestClass]
public class ParallelPortDevices {
    [TestMethod]
    public void BitVector32Test() {
        var lVector = new BitVector32(0);
        Assert.IsTrue(lVector[0]);
        Assert.IsFalse(lVector[1]);
    }
}
This passes, i.e. the first bit appears to be set to 1 (according to the test). However, if you call the ToString method you get "BitVector32{00000000000000000000000000000000}".
Anything I am missing?
Thanks in advance!
BitVector32 does not treat the indexer argument as an index; it treats it as a bitmask. For example, if you were to do this:

lVector[7] = true;

then ToString would give 0...000111, because the mask 7 is binary 111, so all three low bits get set.
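For instance, here is a minimal sketch (not from the original answer) showing how single-bit masks are normally built with BitVector32.CreateMask, which returns successive powers of two:

using System;
using System.Collections.Specialized;

class BitVector32MaskDemo {
    static void Main() {
        var vector = new BitVector32(0);

        int bit0 = BitVector32.CreateMask();      // 1 (binary 0001)
        int bit1 = BitVector32.CreateMask(bit0);  // 2 (binary 0010)
        int bit2 = BitVector32.CreateMask(bit1);  // 4 (binary 0100)

        vector[bit2] = true;                      // sets only the third bit

        Console.WriteLine(vector);                // BitVector32{...00000100}
        Console.WriteLine(vector[bit2]);          // True
        Console.WriteLine(vector[bit0]);          // False
    }
}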
The decompiled code for the getter looks like this:
return ((long) this.data & (long) bit) == (long) (uint) bit;
So with data == 0 and bit == 0, what ends up happening for lVector[0] is this:

(0 & 0) == 0

which, of course, is true: a mask of 0 always "matches", regardless of the stored value.
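Putting it together, here is a hedged sketch of how the test from the question could be rewritten with real masks (the class and method names are just illustrative, assuming the same MSTest attributes as in the question):

using System.Collections.Specialized;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class BitVector32MaskTests {
    [TestMethod]
    public void IndexerTakesAMaskNotAnIndex() {
        var vector = new BitVector32(0);

        Assert.IsTrue(vector[0]);    // mask 0: (0 & 0) == 0, always true
        Assert.IsFalse(vector[1]);   // mask 1: bit 0 is not set

        int bit0 = BitVector32.CreateMask();      // mask 1
        int bit1 = BitVector32.CreateMask(bit0);  // mask 2
        Assert.IsFalse(vector[bit0]);
        Assert.IsFalse(vector[bit1]);
    }
}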