I am running a Monte Carlo simulation on some Risks. The system works correctly, but the standard deviation it produces is completely off; when I simulate the value directly instead, the result is 100% accurate.
The input variables for each risk are the best-case cost, the worst-case cost, the most likely cost, the probability of the risk occurring, and the estimated value (mean * probability).
My current implementation is this (In Java/Apex):
public static Double calculateStandardDeviation(Decimal max, Decimal min, Decimal mostLikely, Decimal eV, Decimal prob) {
    Double sum = 0;
    // uses standard SD calculation
    sum += (min - eV) * (min - eV);
    sum += (max - eV) * (max - eV);
    sum += (mostLikely - eV) * (mostLikely - eV);
    // if the probability is not 100%, apply it to the calculation
    if (prob != 0) {
        sum *= prob;
    }
    return Math.sqrt(sum);
}
Further example:
If I have a Risk with the values Max = 300, Min = 100, mostLikely = 200, eV = 150, Prob = 75%, and I run it through my system, the standard deviation comes out as 26.2. The value I know to be correct is 94 (although this needs to be divided by 2 to function correctly). How would I get that value?
Any help with a more accurate equation would be greatly appreciated! :)
Given a triangular distribution with min, max, and mode, the mean is given by:
mean = (min + max + mode) / 3
and the variance is given by [source]:
var = (min^2 + max^2 + mode^2 - min*max - min*mode - max*mode) / 18
Therefore the standard deviation is given by [source]:
stdev = sqrt(var)
= sqrt( (min^2 + max^2 + mode^2 - min*max - min*mode - max*mode) / 18 )
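The two formulas above can be sketched directly in Java (the `TriangularStats` class name and method names are mine, for illustration). Note that, unlike the code in the question, this computes the spread from min/max/mode alone and does not involve the estimated value or the probability:

```java
public class TriangularStats {

    // Mean of a triangular distribution: (min + max + mode) / 3
    static double mean(double min, double max, double mode) {
        return (min + max + mode) / 3.0;
    }

    // Variance of a triangular distribution:
    // (min^2 + max^2 + mode^2 - min*max - min*mode - max*mode) / 18
    static double variance(double min, double max, double mode) {
        return (min * min + max * max + mode * mode
                - min * max - min * mode - max * mode) / 18.0;
    }

    // Standard deviation is the square root of the variance
    static double stdev(double min, double max, double mode) {
        return Math.sqrt(variance(min, max, mode));
    }

    public static void main(String[] args) {
        // Values from the question: Min = 100, Max = 300, mostLikely = 200
        System.out.println(mean(100, 300, 200));   // 200.0
        System.out.println(stdev(100, 300, 200));  // ~40.82
    }
}
```

With the question's numbers this gives a mean of 200 and a standard deviation of roughly 40.8, which is what the triangular-distribution formulas predict for those three inputs.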