I built a discrete event simulation, and the model itself is stochastic (the total delay time differs from model run to model run). So all I want to do is a Monte Carlo Experiment in which I run the model 1000 times and plot the total delay time of every iteration in a histogram.
I have declared a variable called durchlaufzeit. In the sink of my DES I assign it via durchlaufzeit = Math.round(time()); this is how I want to get the sum of all delay times for this specific model run.
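For reference, this is roughly what that assignment looks like in the Sink block's "On enter" action (a minimal sketch; durchlaufzeit is assumed to be declared as a variable on the root/Main agent):

```java
// Sink block, "On enter" action:
// time() returns the current model time at the moment the agent
// enters the sink, so each arriving agent overwrites durchlaufzeit
// with its own arrival time.
durchlaufzeit = Math.round(time());
```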
Then I created a Monte Carlo Experiment with a histogram called "Durchlaufzeit" that refers to durchlaufzeit via root.durchlaufzeit.
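In the built-in Monte Carlo Experiment this reference is just the histogram's value expression in the properties. If the values were collected in code instead, for example in a custom experiment, a sketch could look like this (histData is a hypothetical Histogram Data element backing the chart):

```java
// Experiment, "After simulation run" action (sketch):
// root is the top-level agent of the run that just finished;
// histData is a hypothetical Histogram Data element shown by the chart.
histData.add(root.durchlaufzeit);
```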
However, when I run the experiment I always get the following invalid histogram, which looks something like a uniform distribution:
I reckon it could have something to do with when and how I assign durchlaufzeit, but I cannot figure out what. Maybe assigning durchlaufzeit = Math.round(time()) at the sink is not the right way to get the total delay time?
Ensure that you have "Random seed" selected in the Monte Carlo Experiment properties under the Randomness section.
A very easy way to test whether the issue is in your model logic or in the way you are recording the stats is to simply assign a random number to the durchlaufzeit variable and see if the histogram gets plotted correctly. If it does, the problem is in your logic; if not, the problem is on the Monte Carlo stat-recording side.
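A sketch of that test in the Sink's "On enter" action, using AnyLogic's built-in uniform() function (the bounds are arbitrary):

```java
// Sink block, "On enter" action - diagnostic only:
// feed a known random value into the variable; if the Monte Carlo
// histogram then shows a sensible shape, the recording side works
// and the problem is in the model logic.
durchlaufzeit = uniform(0, 100);
```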
I did this in a very simple model and the histogram populated as expected.