When I use hexdump on a file with no options, I get rows of hexadecimal bytes:
cf fa ed fe 07 00 00 01 03 00 00 80 02 00 00 00
When I use hexdump -d on the same file, that same data is shown in two-byte decimal groupings:
64207 65261 00007 00256 00003 32768 00002 00000
So what I'm trying to figure out here is how to convert between these two representations. cf and fa in decimal are 207 and 250 respectively. How do those numbers get combined to make 64207?
Bonus question: What is the advantage of using these groupings? The octal display uses 16 groupings of three digits each, so why not use the same layout for the decimal display?
As commented by @georg:
0xfa * 256 + 0xcf == 0xfacf == 64207
The conversion works exactly like this.
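A quick sanity check of that arithmetic (a minimal Python sketch, not part of hexdump itself):

```python
# Combine the two bytes of the first unit: cf is the low byte, fa the high byte
# (little-endian order, as on x86 machines).
low, high = 0xcf, 0xfa
value = high * 256 + low
print(hex(value), value)  # 0xfacf 64207
```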
So, if you look at man hexdump:
-d, --two-bytes-decimal
Two-byte decimal display. Display the input offset in hexadecimal, followed by eight space-separated, five-column, zero-filled, two-byte units of input data, in unsigned decimal, per line.
So, for example:
00000f0 64207 65261 00007 00256 00003 32768 00002 00000
Here, 00000f0 is the hexadecimal input offset, followed by eight two-byte units of input data, e.g. 64207 in decimal (the first 16 bits, i.e. the first two bytes of that chunk of the file).
The conversion (in your case):
cf fa ----> one two-byte unit (the byte ordering depends on your architecture)
fa * 256 + cf = facf ----> after re-ordering ----> 0xfacf
And 0xfacf in decimal is 64207.
Bonus question: It is a convention to display octal numbers using three digits per byte (unlike hex and decimal), since a byte's maximum value, 255, is 377 in octal; so the octal display uses one triplet for each byte.
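One way to see the digit counts behind these conventions: the largest byte value, 255, needs three octal digits but only two hex digits, while the largest two-byte value, 65535, needs five decimal digits, which is why -d uses five-column fields. A small Python illustration:

```python
print(format(255, "03o"))    # 377   -> any byte fits in three octal digits
print(format(255, "02x"))    # ff    -> any byte fits in two hex digits
print(format(65535, "05d"))  # 65535 -> any two-byte unit fits in five decimal digits
```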