I was practicing questions for this topic and I came across this question:
Look at the 40-byte dump of an IP packet containing a TCP segment below (in hexadecimal).
45 20 03 c5 78 06 00 00 34 06 ca 1f d1 55 ad 71 c0 a8 01 7e
00 50 9a 03 3e 64 e5 58 df d0 08 b3 80 18 00 de 00 02 00 00
Identify all the fields of the IP and TCP header.
Source: Q1 of http://www.eng.utah.edu/~cs5480/homeworks/hw3_soln.pdf
Now I do have the datagram format layout of IPv4 in front of me. The thing I don't understand is that the solution says the header length is 20 bytes, but according to the format, bits 4-7 correspond to the header length, which would be 0x03c5 = 965 bytes. However, in the solution, 965 bytes is the total datagram length.
Can someone explain this?
digits != bits.
Assuming zero-based counting, 0x03c5 occupies hex *digits* 4-7 of your data, not bits 4-7.
Bits 4-7 are the hex digit 5 that appears earlier: the low nibble of the first byte, 0x45 (version = 4, header length = 5). That field, the IHL, counts 32-bit words in the header, so you multiply by 4 to get 20 bytes.
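As a sanity check, here is a quick Python sketch that pulls those fields straight out of the dump (field offsets per the IPv4 and TCP header layouts):

```python
# Parse the IP/TCP dump from the question.
# bytes.fromhex ignores whitespace, so the dump can be pasted as-is.
dump = bytes.fromhex(
    "45 20 03 c5 78 06 00 00 34 06 ca 1f d1 55 ad 71 c0 a8 01 7e"
    "00 50 9a 03 3e 64 e5 58 df d0 08 b3 80 18 00 de 00 02 00 00"
)

version    = dump[0] >> 4     # high nibble of byte 0 -> 4 (IPv4)
ihl_words  = dump[0] & 0x0F   # low nibble -> 5 (in 32-bit words)
header_len = ihl_words * 4    # 5 * 4 = 20 bytes
total_len  = int.from_bytes(dump[2:4], "big")  # 0x03c5 = 965 bytes

src_ip = ".".join(str(b) for b in dump[12:16])  # 209.85.173.113
dst_ip = ".".join(str(b) for b in dump[16:20])  # 192.168.1.126

# TCP header starts right after the IP header.
src_port = int.from_bytes(dump[header_len:header_len + 2], "big")      # 80
dst_port = int.from_bytes(dump[header_len + 2:header_len + 4], "big")  # 39427

print(version, ihl_words, header_len, total_len)  # 4 5 20 965
```

Note that the IHL and total-length fields disagree with each other here only because the dump shows just the first 40 bytes of a 965-byte datagram.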