I'm not a native English speaker, so I apologize for any grammar issues (or misused words) in this description.
Recently I've been studying network protocols such as TCP, UDP, IP, etc., and using Wireshark to capture packets in transit. I saw something "weird" that I couldn't understand.
In the IPv4 protocol design, the "version" field is the first field and the "header length" field is the second one; both are 4-bit fields. So when the OS transmits an IPv4 packet, I assumed the "version" and "header length" would together form the first byte, with "version" in the lower 4 bits and "header length" in the upper 4 bits.
But Wireshark shows me the opposite: the "version" field always occupies the upper 4 bits and "header length" the lower 4 bits, e.g. 0x45 (version 4, 20-byte header length).
Since the software works perfectly fine, I know I must be misunderstanding something, but I don't know what.
The Wikipedia article on IPv4 explains it: "The fields in the header are packed with the most significant byte first (big endian), and for the diagram and discussion, the most significant bits are considered to come first (MSB 0 bit numbering). The most significant bit is numbered 0, so the version field is actually found in the four most significant bits of the first byte, for example." https://en.wikipedia.org/wiki/IPv4