Tags: algorithm, performance, matlab, time, sampling

How to design a randomized algorithm to run in a fixed time?


Does anybody know the best way to implement a randomized algorithm (a Monte Carlo simulation to approximate a value) so that it runs for a fixed time (5 minutes)? That is, the algorithm keeps taking new samples until the execution time reaches roughly 5 minutes.

Such an algorithm would make it possible to compare different randomized methods under the same fixed time budget.

Thank you.


Solution

  • Your question is a bit vague, but there are a couple of possible answers. If you are simply drawing independent samples over and over (so each sample is very fast compared to your total time budget), then all you need is access to a system timer in your language: check the elapsed time as you go, and once it is essentially equal to the maximum allowed running time, halt and report your estimate (see the first sketch below).

    However, there are also randomized algorithms whose running time is only bounded in EXPECTATION, and an individual run may or may not finish within the maximum allowed time. If that is what you're dealing with, the best approach is to understand the distribution of the running time under your randomness assumptions, and then adopt a restart strategy: run the algorithm for a specified amount of time, halt it if it hasn't finished, and restart it with fresh randomness. For example, suppose the probability of finishing within N steps is 0.9, the probability of finishing between N and 2N steps is 0.01, and otherwise you finish in more than 2N steps. If you only have 2N steps available, your best strategy is to restart after N steps whenever you haven't finished: letting a run continue succeeds within 2N steps with probability 0.9 + 0.01 = 0.91, while restarting succeeds with probability 1 - (1 - 0.9)^2 = 0.99 (see the second sketch below).
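    For the first case, here is a minimal sketch of a timer-controlled sampling loop in MATLAB (the question's tag). The pi-estimation task, the 5-minute budget, and the batch size are illustrative choices, not part of the question:

    ```matlab
    % Timer-controlled Monte Carlo loop (illustrative: estimates pi).
    % Samples in vectorized batches and checks the wall clock between
    % batches, stopping once the 5-minute budget is spent.
    timeBudget = 5 * 60;     % seconds
    batchSize  = 1e5;        % samples per batch (tuning knob)
    tStart     = tic;

    nSamples = 0;
    nInside  = 0;
    while toc(tStart) < timeBudget
        x = rand(batchSize, 1);     % uniform points in the unit square
        y = rand(batchSize, 1);
        nInside  = nInside + sum(x.^2 + y.^2 <= 1);
        nSamples = nSamples + batchSize;
    end

    piEstimate = 4 * nInside / nSamples;
    fprintf('pi ~ %.6f from %d samples in %.1f s\n', ...
            piEstimate, nSamples, toc(tStart));
    ```

    Checking the timer once per batch instead of once per sample keeps the timing overhead negligible; the cost is that the run may overshoot the budget by at most one batch.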
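    For the second case, here is a sketch of the restart strategy. Since no concrete algorithm is given, lasVegasTrial below is a hypothetical stand-in (each step succeeds independently with probability p, chosen so one N-step attempt finishes about 90% of the time); all names and numbers are illustrative:

    ```matlab
    function restartDemo
    % Restart strategy: at most two attempts of N steps each (a 2N-step
    % budget), restarting with fresh randomness after N unfinished steps.
    N = 1000;                 % step budget per attempt (illustrative)
    maxAttempts = 2;
    for attempt = 1:maxAttempts
        [done, steps] = lasVegasTrial(N);
        if done
            fprintf('Finished on attempt %d after %d steps\n', attempt, steps);
            return;
        end
        % Not finished within N steps: discard this run and restart.
    end
    fprintf('Did not finish within %d total steps\n', maxAttempts * N);
    end

    function [done, steps] = lasVegasTrial(stepBudget)
    % Hypothetical Las Vegas trial: each step succeeds independently with
    % probability p, and the trial gives up after stepBudget steps.
    % Note this toy is memoryless, so restarting is merely neutral here;
    % the strategy pays off when the runtime distribution is heavy-tailed,
    % as in the 0.9 / 0.01 example above.
    p = 0.0023;               % ~90% chance of success within 1000 steps
    for steps = 1:stepBudget
        if rand < p
            done = true;
            return;
        end
    end
    done = false;
    steps = NaN;
    end
    ```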