Tags: c#, arrays, .net, byte, bit

How to read an int from a byte[] with an offset and a size of bits in C#


I need a function that takes a bit offset and a bit count as parameters and reads an int value from a byte array.

int GetInt(byte[] data, int bitOffset, int bitSize)

For example, I have the following array of bytes:

66 DC 00 00 6A DC 00 00
66 DC 00 00 58 DC 00 00
54 DC 00 00 50 DC 00 00
4C DC 00 00 00 00 00 00
00 00 00 00 00 00 00 08
F0 FF FF 9F F4 7F 20 9A
91 EB 85 88 3F 6E 00 80
3D 6E 00 80 3B 6E 00 00

The same data in bits (each byte shown least significant bit first, matching the bit numbering used below):

01100110  00111011  00000000  00000000  01010110  00111011  00000000  00000000
01100110  00111011  00000000  00000000  00011010  00111011  00000000  00000000
00101010  00111011  00000000  00000000  00001010  00111011  00000000  00000000
00110010  00111011  00000000  00000000  00000000  00000000  00000000  00000000
00000000  00000000  00000000  00000000  00000000  00000000  00000000  00010000
00001111  11111111  11111111  11111001  00101111  11111110  00000100  01011001
10001001  11010111  10100001  00010001  11111100  01110110  00000000  00000001
10111100  01110110  00000000  00000001  11011100  01110110  00000000  00000000

How do I implement this function as efficiently as possible so that it produces the following return values?

var a = GetInt(data, 0, 32); // a = 56422
var b = GetInt(data, 313, 11); // b = 4
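A walk-through of the second call, to make the bit numbering explicit: bit 313 lies in byte 39 (313 / 8 = 39, remainder 1), and byte 39 is 0x08. Reading 11 bits from there, least significant bit first, only bit 315 (bit 3 of 0x08) is set; it lands at position 315 - 313 = 2 of the result, so b = 1 << 2 = 4.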


EDIT: Here are the bytes as a C# array:

new byte[]
{
    0x66, 0xDC, 0x00, 0x00, 0x6A, 0xDC, 0x00, 0x00,
    0x66, 0xDC, 0x00, 0x00, 0x58, 0xDC, 0x00, 0x00,
    0x54, 0xDC, 0x00, 0x00, 0x50, 0xDC, 0x00, 0x00,
    0x4C, 0xDC, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x08,
    0xF0, 0xFF, 0xFF, 0x9F, 0xF4, 0x7F, 0x20, 0x9A,
    0x91, 0xEB, 0x85, 0x88, 0x3F, 0x6E, 0x00, 0x80,
    0x3D, 0x6E, 0x00, 0x80, 0x3B, 0x6E, 0x00, 0x00
}

EDIT 2: I have already implemented my own solution, which is how I obtained the values in this post. I'm just very unhappy with it because I don't want to copy the array into a BitArray on every call; when reading a file, this function is called several hundred thousand times. (One possible mitigation is sketched after the code below.)

public static int GetInt(this byte[] data, int bitOffset, int bitSize)
{
    //BitArray lives in System.Collections; this copy happens on every call
    var bits = new BitArray(data);

    var output = 0;

    for(var bitIndex = 0; bitIndex < bitSize; bitIndex++)
    {
        var bit = bits.Get(bitOffset + bitIndex) ? 1 : 0;
        output |= bit << bitIndex;
    }

    return output;
}
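One possible mitigation, assuming the same buffer is read many times, is to pay the BitArray allocation once per buffer instead of once per call. A minimal sketch (the BitReader name is hypothetical):

using System.Collections;

public sealed class BitReader
{
    private readonly BitArray _bits;

    //copy the buffer into a BitArray once, up front
    public BitReader(byte[] data) => _bits = new BitArray(data);

    //same bit numbering as the extension method above
    public int GetInt(int bitOffset, int bitSize)
    {
        var output = 0;
        for (var i = 0; i < bitSize; i++)
        {
            if (_bits.Get(bitOffset + i)) output |= 1 << i;
        }
        return output;
    }
}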

Solution

  • C#, 90 bytes

        int GetInt(byte[] d,int o,int b)=>Enumerable.Range(o,b).Sum(i=>(d[i/8]>>i%8)%2>0?1<<i-o:0);
    

    I've written this in code golfed form to poke fun at the fact that your question reads like code golf, and that you haven't shown any attempt of your own ;)
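    For anyone decoding the one-liner: it needs a using System.Linq; directive, and with explicit parentheses the same expression reads as follows (this is just the golfed code reformatted, not a different algorithm):

        int GetInt(byte[] d, int o, int b) =>
            Enumerable.Range(o, b)
                      .Sum(i => ((d[i / 8] >> (i % 8)) % 2 > 0) ? (1 << (i - o)) : 0);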

    Here is a unit test for any future answerers. (OP, this might be helpful to you too.)

    [TestMethod]
    public void Test()
    {
        var bytes = new byte[]
        {
            0x66, 0xDC, 0x00, 0x00, 0x6A, 0xDC, 0x00, 0x00,
            0x66, 0xDC, 0x00, 0x00, 0x58, 0xDC, 0x00, 0x00,
            0x54, 0xDC, 0x00, 0x00, 0x50, 0xDC, 0x00, 0x00,
            0x4C, 0xDC, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
            0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x08,
            0xF0, 0xFF, 0xFF, 0x9F, 0xF4, 0x7F, 0x20, 0x9A,
            0x91, 0xEB, 0x85, 0x88, 0x3F, 0x6E, 0x00, 0x80,
            0x3D, 0x6E, 0x00, 0x80, 0x3B, 0x6E, 0x00, 0x00
        };
        RunTest(0, 32, 56422);
        RunTest(313, 11, 4);
    
        void RunTest(int offset, int bitSize, int expected)
        {
            var actual = GetInt(bytes, offset, bitSize);
        Assert.AreEqual(expected, actual); //MSTest's parameter order is (expected, actual)
        }
    }
    

    Edit: Since you showed your own attempt, here is a non-code-golfed answer:

    //first write a function that gets a bit value from the byte[]
    bool GetBitFromByteArray(byte[] data, int bitNumber)
    {
        //8 bits per byte.
        const int sizeOfByte = 8;
        
        var byteNumber = bitNumber / sizeOfByte;//index within the byte array. Integer division always rounds down
        var bitNumberWithinTheByte = bitNumber % sizeOfByte;//bit index within that byte
    
        //now write a function that gets a bit value from a byte
        return GetBitFromByte(data[byteNumber], bitNumberWithinTheByte);
    }
    
    bool GetBitFromByte(byte byteValue, int bitNumber)
    {
        //bit shift so that the bit in question is in the least significant place
        var shifted = byteValue >> bitNumber;
    
        //mod 2 checks if the least significant bit is 0 or 1
        return shifted % 2 > 0;
    }
    
    int GetInt(byte[] data, int offset, int bitCount)
    {
        //get bit values in order
        var bitValues = new List<bool>(bitCount);
        for (int i = 0; i < bitCount; i++)
        {
            bitValues.Add(GetBitFromByteArray(data, i + offset));
        }
    
        //sum up the bit values as powers of 2
        var intValue = 0;
        for (int i = 0; i < bitCount; i++)
        {
            var bitValue = bitValues[i];
            if (bitValue) intValue += 1 << i;//1<<i is equivalent to 2^i
        }
    
        return intValue;
    }
    

    If you are worried about performance and allocations, the code-golfed answer will actually do better: it allocates no intermediate List<bool>, and neither of these versions copies the whole array into a BitArray the way your extension method does.
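    If you also want to skip the LINQ enumerator and delegate overhead, a plain loop allocates nothing at all. A minimal sketch along the same lines (GetIntFast is my own name, not from the question):

        static int GetIntFast(byte[] data, int bitOffset, int bitSize)
        {
            var value = 0;
            for (var i = 0; i < bitSize; i++)
            {
                var bitIndex = bitOffset + i;
                //byte index is bitIndex / 8; bit position within that byte is bitIndex % 8
                var bit = (data[bitIndex >> 3] >> (bitIndex & 7)) & 1;
                value |= bit << i;
            }
            return value;
        }

    It produces the same results for both test cases above (56422 and 4).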