AnyLogic: Total Delay Time in a discrete event simulation


Is there any function to measure the total delay time needed in an iteration of a DES? I want to run a Monte Carlo experiment with my DES and iterate it 1000 times. For every iteration I want to measure the total delay time and plot it in a histogram. I have already implemented a Monte Carlo experiment. My idea was to have a variable totalDelayTime and assign it the total delay time needed in every iteration; in the Monte Carlo experiment I then want to plot this variable via a histogram.

Is there a solution or a simple AnyLogic function to measure/get the total delay time? I tried to set my variable in the sink with totalDelayTime = time(). But when I trace this variable via traceln(totalDelayTime) to the console, I get the exact same delay time for every iteration. However, when I just write traceln(time()) in the sink, I get different delay times for every iteration.
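For reference, a minimal sketch of what I have so far (the variable name totalDelayTime and the placement in the sink's "On enter" action are just my own setup):

    // Sink "On enter" action (totalDelayTime is a double variable on Main)
    totalDelayTime = time();   // store the model time when an agent reaches the sink
    traceln(totalDelayTime);   // prints the exact same value in every iteration
    traceln(time());           // prints a different value in every iteration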


Solution

  • You can get the total simulation run time by calling time() in the "On destroy" action field of Main. It returns the elapsed model time in the model's time units.

    If you need it in a specific unit, call time(MINUTE) or similar.
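
    As a minimal sketch (assuming a plain double variable totalDelayTime exists on Main; the variable name is just illustrative), the "On destroy" action of Main could look like this:

        // "On destroy" action of Main - runs once when the model run ends
        totalDelayTime = time();          // total run length in model time units
        // totalDelayTime = time(MINUTE); // or convert to a specific unit instead
        traceln("Total run time of this iteration: " + totalDelayTime);

    In the Monte Carlo experiment, the histogram's value expression can then reference this variable on the top-level agent, e.g. root.totalDelayTime (assuming Main is the experiment's top-level agent type), so each of the 1000 runs contributes one value to the histogram.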