I'm writing an application whose throughput (the number of bits per second it sends over the wire) I can set to whatever rate I wish. I would like to set it as high as possible, as long as other traffic on the network is not heavily impacted.
The problem is, I don't have a good metric to measure that impact. I thought of the following ones, but neither is really "complete":
Is there any standard metric? Do you have any other ideas on how to measure an application's impact on the network?
btw - I have complete control over the network, and can take whatever measurements I want in order to compute that metric.
Thanks,
Rouli
Different networks behave in different ways as you exceed their bandwidth. Most of them go through a succession of badness along these lines:

(1) Round-trip times climb as queues along the path fill up.
(2) Jitter climbs as queue depth starts to fluctuate.
(3) Packets get dropped once the queues overflow.
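To find out where a given network sits in that progression, one option is to step the application's send rate up and sample round-trip time and loss at each step. Below is a minimal sketch; `set_app_rate` is a placeholder for however your application sets its throughput, and a UDP echo service is assumed to be reachable at `ECHO_ADDR` on the far side of the link (both are assumptions, not part of the question). Note that it probes with a separate packet stream, which the next paragraph qualifies.

```python
import socket
import statistics
import struct
import time

# Assumptions (not from the question): a UDP echo service that sends every
# datagram straight back is running at ECHO_ADDR, and set_app_rate() stands
# in for however the application actually sets its throughput.
ECHO_ADDR = ("192.0.2.10", 9999)
PROBES_PER_STEP = 20
TIMEOUT_S = 1.0


def set_app_rate(bits_per_second):
    """Placeholder: hook this up to the application's rate control."""
    pass


def probe_rtt(sock):
    """Send one timestamped datagram; return RTT in ms, or None if it is lost."""
    sent = time.monotonic()
    sock.sendto(struct.pack("!d", sent), ECHO_ADDR)
    try:
        sock.recvfrom(64)
    except socket.timeout:
        return None
    return (time.monotonic() - sent) * 1000.0


def measure_step(sock):
    """Return (median RTT in ms, loss fraction) over one burst of probes."""
    rtts = [probe_rtt(sock) for _ in range(PROBES_PER_STEP)]
    ok = [r for r in rtts if r is not None]
    loss = 1.0 - len(ok) / PROBES_PER_STEP
    return (statistics.median(ok) if ok else float("inf")), loss


def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(TIMEOUT_S)
    for rate in (1_000_000, 5_000_000, 10_000_000, 50_000_000):
        set_app_rate(rate)
        time.sleep(2)  # let queues settle at the new rate
        rtt, loss = measure_step(sock)
        print(f"{rate / 1e6:6.1f} Mbit/s  rtt={rtt:8.2f} ms  loss={loss:5.1%}")


if __name__ == "__main__":
    main()
```

Watching where the RTT and loss columns start to climb gives a rough upper bound on the rate you can sustain.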
If some form of QoS is in use, different packet streams may see these effects independently. E.g., you may be pumping 3x the bandwidth over your app's connection and still see relatively little change in ping time, because the ping packets travel in a separate stream. So you must measure with your application's packets.
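One way to do that, assuming you can prepend a few bytes to each payload and the receiver can echo them back (neither of which is given by the question), is to piggyback a sequence number and send timestamp on the application's own packets:

```python
import struct
import time

# Assumption: the application can prepend 12 bytes (uint32 sequence number +
# float64 send time) to each payload, and the receiver echoes that header back
# in a small acknowledgement. Nothing else about the protocol is assumed.
HEADER = struct.Struct("!Id")


class InBandProbe:
    """Tracks RTT and loss using the application's own packet stream."""

    def __init__(self):
        self.next_seq = 0
        self.outstanding = {}  # seq -> send time, not yet acknowledged
        self.rtts_ms = []

    def tag(self, payload: bytes) -> bytes:
        """Wrap an outgoing application payload with a probe header."""
        now = time.monotonic()
        data = HEADER.pack(self.next_seq, now) + payload
        self.outstanding[self.next_seq] = now
        self.next_seq += 1
        return data

    def on_echo(self, header_bytes: bytes) -> None:
        """Record the RTT for a header echoed back by the receiver."""
        seq, sent = HEADER.unpack(header_bytes[:HEADER.size])
        if seq in self.outstanding:
            del self.outstanding[seq]
            self.rtts_ms.append((time.monotonic() - sent) * 1000.0)

    def loss_fraction(self, timeout_s: float = 2.0) -> float:
        """Treat packets unacknowledged for longer than timeout_s as lost."""
        now = time.monotonic()
        lost = sum(1 for t in self.outstanding.values() if now - t > timeout_s)
        return lost / self.next_seq if self.next_seq else 0.0
```

Because the probe header rides inside the same stream the QoS policy sees, the RTTs and losses it reports reflect what the application's packets actually experience.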
(1) and (2) may not occur on a given network. (3) will always occur, no matter what. All three can, unfortunately, also occur even when you're nowhere near the bandwidth limit.
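Since all three effects can show up even at low rates, it is safer to compare measurements taken under load against an idle baseline than against absolute thresholds. Here is a sketch of summarizing the three signals that way; the jitter figure is just the mean absolute difference between consecutive RTT samples, and the sample numbers are purely illustrative:

```python
import statistics


def jitter_ms(rtts_ms):
    """Mean absolute difference between consecutive RTT samples (rough jitter)."""
    if len(rtts_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(rtts_ms, rtts_ms[1:])]
    return sum(diffs) / len(diffs)


def impact_report(baseline_rtts_ms, loaded_rtts_ms, sent, received):
    """Summarize the three degradation signals relative to an idle baseline."""
    latency_increase = statistics.median(loaded_rtts_ms) - statistics.median(baseline_rtts_ms)
    jitter_increase = jitter_ms(loaded_rtts_ms) - jitter_ms(baseline_rtts_ms)
    loss = 1.0 - received / sent if sent else 0.0
    return {
        "latency_increase_ms": latency_increase,  # effect (1)
        "jitter_increase_ms": jitter_increase,    # effect (2)
        "loss_fraction": loss,                    # effect (3)
    }


if __name__ == "__main__":
    # Purely illustrative numbers, not measurements.
    idle = [10.1, 10.3, 9.9, 10.0, 10.2]
    busy = [24.0, 31.5, 22.8, 40.2, 27.9]
    print(impact_report(idle, busy, sent=1000, received=988))
```

A rate-control policy could then back the application off whenever the latency or loss deltas exceed whatever you decide "heavily impacted" means on your network.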