Every so often I need to see the binary/hex representation of some data (TCP packets, files, binary serializations of data structures, ...), and it seems I always stumble across the same issues with endianness and byte order every time I write the string conversion.
There are already other questions on this topic, but I somehow seem to have a problem with my specific implementation.
I am trying to write a few functions that convert any byte/byte array into a binary string (in blocks of 8 bits, since the built-in functions only emit the bits actually used), but the problem is I get different results, and I don't know where I have to take care of the endianness of the underlying system.
using System;

class Program
{
    static void Main(string[] args)
    {
        //ushort x = 13;
        ushort x = 3328;
        string res = ToBinaryString(x);
        string res2 = ToBinaryString(BitConverter.GetBytes(x));
        Console.WriteLine("res: " + res + "\r\n" + "res2: " + res2);
    }

    // Single byte: always 8 characters.
    public static string ToBinaryString(byte v)
    {
        return Convert.ToString(v, 2).PadLeft(8, '0');
    }

    // Byte array: concatenates the 8-character blocks in array order.
    public static string ToBinaryString(byte[] v)
    {
        string[] strArr = new string[v.Length];
        for (int i = 0; i < v.Length; i++)
        {
            strArr[i] = ToBinaryString(v[i]);
        }
        return string.Join("", strArr);
    }

    // ushort: always 16 characters.
    public static string ToBinaryString(ushort v)
    {
        return Convert.ToString(v, 2).PadLeft(16, '0');
    }
}
The unexpectedly different output is:
res: 0000110100000000
res2: 0000000000001101
So I am a bit confused: so far I had thought the two approaches would return the same (correct) result, not different ones.
Noteworthy: I am on a Windows dev machine.
Okay.
The code below assumes a little-endian machine; I didn't check it against big-endian. According to Microsoft's documentation, on a little-endian machine BitConverter.GetBytes writes the least significant byte of the number into the first element of the byte array, then the next more significant byte into the second element, and so on.
In other words, the order of the elements of the byte array returned by BitConverter.GetBytes depends on the machine's endianness.
So this is what happens on your little-endian machine.
3328 is 0x0D00 in hex. 0x00 is the least significant byte and 0x0D is the most significant byte. Let's see what happens when you pass the whole number to BitConverter.GetBytes(x).
It converts your number into an array whose first element is 0x00 and whose second element is 0x0D. (v[0] = 0x00; v[1] = 0x0D)
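If you want to see this for yourself, here is a quick check you can drop into Main (a minimal sketch; the commented output is what a little-endian machine prints):
byte[] bytes = BitConverter.GetBytes((ushort)3328); // 3328 == 0x0D00
Console.WriteLine(BitConverter.IsLittleEndian);     // True on x86/x64
Console.WriteLine(bytes[0].ToString("X2"));         // 00 <- least significant byte comes first
Console.WriteLine(bytes[1].ToString("X2"));         // 0D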
Now let's take a look at what happens in the ToBinaryString overload that takes the byte array.
string[] strArr = new string[v.Length]
turns out to be strArr = new string[2], and then
for(int i = 0; i < v.Length; i++)
turns out to be for(int i = 0; i < 2; i++).
At the first pass, you get
strArr[0] = ToBinaryString(v[0]);
=> strArr[0] = "00000000" // since v[0] holds a pure zero value.
At the second pass, you get
strArr[1] = ToBinaryString(v[1]);
=> Convert.ToString(0x0D, 2) => "1101", and "1101".PadLeft(8, '0') => "00001101".
Now let's take a look at
string.Join("", strArr)
That's it:
strArr[0] + strArr[1] = "00000000" + "00001101"
i.e. it returns 0000000000001101.
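You can reproduce the two passes with this minimal check (it assumes the little-endian array from above):
byte[] v = BitConverter.GetBytes((ushort)3328);               // { 0x00, 0x0D } on little-endian
Console.WriteLine(Convert.ToString(v[0], 2).PadLeft(8, '0')); // 00000000
Console.WriteLine(Convert.ToString(v[1], 2).PadLeft(8, '0')); // 00001101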
Try
public static string ToBinaryString(byte[] v)
{
    string[] strArr = new string[v.Length];
    // Walk the array from both ends: v[0] (the least significant byte
    // on a little-endian machine) lands at the end of the string.
    for (int i = v.Length, j = 0; i > 0; i--, j++)
    {
        strArr[i - 1] = ToBinaryString(v[j]);
    }
    return string.Join("", strArr);
}
res: 0000110100000000
res2: 0000110100000000
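One caveat: the reversal above is only correct on little-endian machines. If the code also has to run on big-endian hardware, where GetBytes already emits the most significant byte first, you can branch on BitConverter.IsLittleEndian. A sketch (untested on big-endian as well; the name ToBinaryStringPortable is my own):
public static string ToBinaryStringPortable(byte[] v)
{
    // Copy so we don't mutate the caller's array, then put the
    // most significant byte first regardless of machine order.
    byte[] bytes = (byte[])v.Clone();
    if (BitConverter.IsLittleEndian)
        Array.Reverse(bytes);
    string[] strArr = new string[bytes.Length];
    for (int i = 0; i < bytes.Length; i++)
        strArr[i] = ToBinaryString(bytes[i]);
    return string.Join("", strArr);
}
This way the string always reads most-significant-byte first, matching the ToBinaryString(ushort) overload on either architecture.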