I used PerfMon on Windows XP to check the network load of an application that I have written.
In the example below you see five columns:
Date Time, Bandwidth, [x] Bytes per second sent, [x] Bytes per second received, [x] Total bytes per second
where [x] is the network interface that I checked the load against.
Here's the data.
02/18/2014 15:30:50.894,"1000000000","922.92007218169454","826.92838536756381","1749.8484575492582"
02/18/2014 15:30:51.894,"1000000000","994.06970480770792","774.05427718427154","1768.1239819919795"
02/18/2014 15:30:52.894,"1000000000","1446.0226222234514","1319.0206353476713","2765.0432575711229"
02/18/2014 15:30:53.894,"1000000000","2652.0592714274339","1207.0269760983833","3859.0862475258173"
Date, Time and Bandwidth (10^9 bit = 1 Gbit, the LAN connection) are obviously correct.
The other three columns are hard to interpret. The unit is supposed to be bytes per second for each of them, but how can the system resolve 13 or 14 digits after the decimal point if these were really bytes? What is 0.0000000000000001 of a byte? The values are plausible up to the decimal point, though.
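For illustration, a minimal sketch (Python; the rows are copied verbatim from the log above) that just parses these lines. The three rate columns come out as ordinary double-precision floating-point values:

import csv

# Rows copied verbatim from the PerfMon CSV log above.
rows = [
    '02/18/2014 15:30:50.894,"1000000000","922.92007218169454","826.92838536756381","1749.8484575492582"',
    '02/18/2014 15:30:51.894,"1000000000","994.06970480770792","774.05427718427154","1768.1239819919795"',
]

for timestamp, bandwidth, sent, received, total in csv.reader(rows):
    # The rate columns parse as plain IEEE-754 doubles; the long decimal
    # tails are simply how a double is written out at full precision.
    print(timestamp, int(bandwidth), float(sent), float(received), float(total))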
The timer's resolution is higher than what is shown. You might send 923 bytes in 1,000,083 microseconds: the time column only shows whole milliseconds and drops the extra microseconds, but the bytes-per-second column is calculated as 923 / 1.000083. Note: I made these numbers up; it doesn't make much sense to hunt for a pair that reproduces your 922.9200... exactly.
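A tiny sketch of that arithmetic (the byte count and interval are the made-up numbers from above, not real measurements): dividing an exact byte count by an exact, microsecond-resolution interval almost never gives a whole number, and the quotient is stored as a double, which is where the long decimal tail comes from.

# Made-up sample from the explanation above: 923 bytes observed over an
# interval the high-resolution timer measured as 1,000,083 microseconds.
bytes_sent = 923
elapsed_us = 1_000_083

# The time column of the log is rounded to whole milliseconds, but the
# rate is computed from the full-resolution interval, so the quotient
# is almost never an integer.
rate = bytes_sent / (elapsed_us / 1_000_000)  # bytes per second
print(rate)  # roughly 922.92..., a double with a long decimal tail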