Why does hexdump print 00ff here? I expected it to print ff00, in the same order the bytes arrived on stdin, but
$ printf "\xFF\x00" | hexdump
0000000 00ff
0000002
it looks like hexdump reversed the bytes. Why?
This is because, by default, hexdump dumps 16-bit words (two bytes per hex group), and x86 processors (which you are probably using) store words in little-endian format.
From Wikipedia, Endianness:
A big-endian system stores the most significant byte of a word at the smallest memory address and the least significant byte at the largest. A little-endian system, in contrast, stores the least-significant byte at the smallest address.
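You can cross-check this with od, which makes the distinction explicit: -tx2 groups the input into 16-bit words (byte-swapped on a little-endian host), while -tx1 prints single bytes in stream order. This sketch assumes a little-endian machine such as x86:

```shell
# 16-bit word view: bytes are swapped on a little-endian host
printf '\xFF\x00' | od -An -tx2
# prints: 00ff

# single-byte view: bytes appear in input order
printf '\xFF\x00' | od -An -tx1
# prints: ff 00
```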
Notice that when you use hexdump without specifying any format option, the output is similar to that of -x.
From the hexdump man page:
If no format strings are specified, the default
display is very similar to the -x output format (the
-x option causes more space to be used between format
units than in the default output).
...
-x, --two-bytes-hex
Two-byte hexadecimal display. Display the input offset in
hexadecimal, followed by eight space-separated, four-column,
zero-filled, two-byte quantities of input data, in
hexadecimal, per line.
If you want to dump single bytes in input order, use the -C option or specify a custom format with -e.
$ printf "\xFF\x00" | hexdump -C
00000000  ff 00                                             |..|
00000002
$ printf "\xFF\x00" | hexdump -e '"%07.7_ax " 8/1 "%02x " "\n"'
0000000 ff 00
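The word grouping is easier to see with four distinct bytes: assuming a little-endian host, the default output swaps each pair, while -C preserves stream order:

```shell
# Default format groups bytes into 16-bit little-endian words,
# so each pair appears swapped (assumes a little-endian host)
printf '\x11\x22\x33\x44' | hexdump
# 0000000 2211 4433

# -C shows the bytes in the order they arrived
printf '\x11\x22\x33\x44' | hexdump -C
```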