Tags: algorithm, language-agnostic, estimation, average

Calculating the actual average value


I've got a relatively small set of integers (~100 values): each of them represents how long (in milliseconds) a test I ran took.

The trivial algorithm to calculate the average is to sum up all n values and divide the result by n, but this doesn't take into account that some ridiculously high/low values are probably wrong and should be discarded.
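
For reference, a minimal Python sketch of that trivial approach (the `timings_ms` list is made-up sample data, just for illustration):

    # Naive average: sum all values and divide by their count.
    # timings_ms is a hypothetical list of test durations in milliseconds.
    timings_ms = [102, 98, 105, 3000, 97, 101]  # 3000 is an obvious outlier
    naive_average = sum(timings_ms) / len(timings_ms)
    print(naive_average)  # heavily skewed upward by the outlier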

What algorithms are available to estimate the actual average value?


Solution

  • As you said, you can discard all values that diverge more than a given amount from the average and then recompute the average over the remaining values. Another statistic that can be interesting is the median, i.e. the middle value of the sorted data, which is naturally robust to outliers (the most frequent value is the mode, not the median). See the sketch below.
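
A minimal Python sketch of both ideas. The cutoff of two standard deviations and the sample data are assumptions for illustration, not part of the original answer:

    import statistics

    def robust_average(values, max_deviations=2.0):
        """Discard values more than max_deviations standard deviations
        from the mean, then recompute the mean over what remains."""
        mean = statistics.mean(values)
        stdev = statistics.stdev(values)
        kept = [v for v in values if abs(v - mean) <= max_deviations * stdev]
        return statistics.mean(kept) if kept else mean

    timings_ms = [102, 98, 105, 3000, 97, 101]   # hypothetical test durations
    print(robust_average(timings_ms))            # the 3000 ms outlier is discarded
    print(statistics.median(timings_ms))         # middle value of the sorted data

Note that filtering on "distance from the mean" is itself sensitive to the outliers it is trying to remove, which is one reason the median is often the simpler and safer choice for a small, noisy sample like this.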