c# · .net · parsing · compiler-construction

Converting Int16 to byte[] is not working in C# with BitConverter.GetBytes()


I am working on a super simple parser/compiler for an example language, and I am having some problems with number conversion. I have the following code as a test:

Console.WriteLine(BitConverter.GetBytes((short)0x010D)[0]);
Console.WriteLine(BitConverter.GetBytes((short)0x010D)[1]);

And in the console it prints:

13
1

I am confused, because that means the array is [13, 1]. I assumed the bytes would appear in the same left-to-right order as the original number. Is there a way to fix this, or do I just need to treat the array as if it runs the other way?

Thanks a lot!

P.S. Apologies if this is dumb, I just can't seem to find anything with my problem, which may well be because this is a user error.


Solution

  • I am answering my own question based on a comment from Jon Skeet that pointed me in the right direction.

    The solution to this question is really quite simple, and it was just a quirk of working with bytes and binary that I was not aware of.

    See:
    Endianness Wikipedia Article
    GetBytes Docs

    Endianness is the order in which the bytes of a multi-byte number are stored. In my case, .NET uses the platform's native byte order, which on most hardware is little-endian: the least significant byte comes first, followed by the more significant bytes. For the question's example, 0x010D is represented as { 0x0D, 0x01 } in little-endian, exactly as observed. In big-endian it would instead be { 0x01, 0x0D }.
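    A minimal sketch of how to check and control this. `BitConverter.IsLittleEndian` reports the platform's byte order, and `BinaryPrimitives` (from `System.Buffers.Binary`, available since .NET Core 2.1) writes a value with an explicit, platform-independent byte order. The expected output comments assume a little-endian host, which covers x86/x64 and typical ARM configurations:

    ```csharp
    using System;
    using System.Buffers.Binary;

    class EndianDemo
    {
        static void Main()
        {
            short value = 0x010D;

            // BitConverter uses the CPU's native byte order.
            Console.WriteLine(BitConverter.IsLittleEndian); // True on most platforms

            byte[] native = BitConverter.GetBytes(value);
            Console.WriteLine($"{native[0]:X2} {native[1]:X2}"); // 0D 01 on little-endian hosts

            // For a guaranteed byte order regardless of platform,
            // use BinaryPrimitives instead of BitConverter:
            Span<byte> big = stackalloc byte[2];
            BinaryPrimitives.WriteInt16BigEndian(big, value);
            Console.WriteLine($"{big[0]:X2} {big[1]:X2}"); // 01 0D, always
        }
    }
    ```

    If you are serializing for a file format or network protocol that mandates a specific byte order, the `BinaryPrimitives` route is the safer choice, since code relying on `BitConverter` silently changes meaning on a big-endian host.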

    Thanks again to Jon Skeet for the helpful comment!