
Converting a byte array to a hexadecimal value using the BitConverter class in C#?


I'm trying to convert a byte array into a hexadecimal value using the BitConverter class.

long hexValue = 0x780B13436587;
byte[] byteArray = BitConverter.GetBytes(hexValue);
string hexResult = BitConverter.ToString(byteArray);

Now if I execute the above code line by line, this is what I see:

hexResult = "87-65-43-13-0B-78-00-00"

I thought the hexResult string would be the same as hexValue (i.e. 780B13436587h), but what I get is different. Am I missing something? Correct me if I'm wrong.

Thanks!


Solution

  • Endianness.

    BitConverter uses CPU-endianness, which for most people means: little-endian. When humans write numbers, we tend to write big-endian (broadly speaking: you write the thousands, then the hundreds, then the tens, then the units). For a CPU, big-endian means that the most-significant byte is first and the least-significant byte is last.

    However, unless you're using an Itanium, your CPU is probably little-endian, which means that the most-significant byte is last and the least-significant byte is first. The CPU is implemented such that this doesn't matter unless you are peeking inside raw memory - it ensures that numeric and binary arithmetic still works the way you expect. However, BitConverter works by peeking inside raw memory - hence you see the reversed data.

    If you want the value in big-endian format, then you'll need to (see the sketch after this list):

    • do it manually in big-endian order
    • check the BitConverter.IsLittleEndian value, and if true:
      • either reverse the input bytes
      • or reverse the output
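
    For example, here is a minimal sketch of the reverse-the-input approach (variable names are just illustrative):

    long hexValue = 0x780B13436587;
    byte[] byteArray = BitConverter.GetBytes(hexValue);

    // On a little-endian CPU the least-significant byte comes first,
    // so reverse the array to get big-endian (human-readable) order.
    if (BitConverter.IsLittleEndian)
    {
        Array.Reverse(byteArray);
    }

    string hexResult = BitConverter.ToString(byteArray);
    // "00-00-78-0B-13-43-65-87" - the leading 00-00 appears because
    // a long is 8 bytes, while 0x780B13436587 only occupies 6 of them.

    If you don't want the separators, you can strip them afterwards with hexResult.Replace("-", "").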