I feel dumb, but I can't work out the time step used to produce the following graph from this data (it was captured with tcpdump, and I'm supposed to make the same kind of plot myself for various websites):
18:43:39.577369 0 out
18:43:39.577449 0 out
18:43:39.819272 0 in
18:43:39.819300 0 out
18:43:39.819531 194 out
18:43:39.827914 0 out
18:43:39.829722 0 in
18:43:39.829741 0 out
18:43:39.829944 194 out
18:43:40.059952 0 in
18:43:40.061021 1448 in
18:43:40.061050 0 out
18:43:40.061108 1448 in
18:43:40.061124 0 out
18:43:40.061163 1200 in
18:43:40.061176 0 out
18:43:40.064159 0 in
18:43:40.064225 0 out
18:43:40.064864 194 out
18:43:40.069418 1448 in
18:43:40.069436 0 out
18:43:40.070015 859 in
18:43:40.070023 0 out
18:43:40.076474 126 out
18:43:40.081113 0 in
18:43:40.082162 1448 in
18:43:40.082174 0 out
18:43:40.082194 1448 in
18:43:40.082199 0 out
18:43:40.082208 1200 in
18:43:40.082212 0 out
18:43:40.094615 1448 in
18:43:40.094636 0 out
etc
Any help would be greatly appreciated; I really need to figure this out quickly!
Each line of the data has a timestamp, the packet size in bytes, and an indication of direction (in or out).
The graph divides time into slots of 10 ms and sums the bytes sent (out) and received (in) within each slot. A data point is plotted at the end of each slot.
E.g. between 30 and 40 ms, packets of sizes 1448, 1448, and 1200 bytes are received, giving a data point of 1448 + 1448 + 1200 = 4096 at 40 ms in the red graph.
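To make the binning concrete, here is a minimal Python sketch of the procedure just described. The function names, the choice of the first packet's timestamp as the time origin, and the fixed 10 ms slot width are my own assumptions, not taken from the assignment:

```python
from collections import defaultdict

def parse_time(ts):
    """Convert an 'HH:MM:SS.micros' tcpdump timestamp to seconds (float)."""
    h, m, s = ts.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def bin_traffic(lines, slot=0.010):
    """Sum packet bytes per time slot, separately for 'in' and 'out'.

    Returns two dicts mapping slot index -> total bytes; slot indices
    are relative to the first packet's timestamp.
    """
    totals = {"in": defaultdict(int), "out": defaultdict(int)}
    t0 = None
    for line in lines:
        ts, size, direction = line.split()
        t = parse_time(ts)
        if t0 is None:
            t0 = t                        # use the first packet as time zero
        idx = int((t - t0) // slot)       # which 10 ms slot this packet falls in
        totals[direction][idx] += int(size)
    return totals["in"], totals["out"]

# A few lines from the capture above; all fall within 10 ms of the first one,
# so they land in slot 0.
sample = [
    "18:43:40.059952 0 in",
    "18:43:40.061021 1448 in",
    "18:43:40.061108 1448 in",
    "18:43:40.061163 1200 in",
    "18:43:40.069418 1448 in",
]
incoming, outgoing = bin_traffic(sample)
```

Each resulting (slot index, total bytes) pair is then one data point at the end of its slot, which can be plotted with any plotting tool.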