Tags: algorithm, minimum-cut

How many independent runs are needed to ensure that at least one trial succeeds (e.g., for the minimum cut of a graph)?


I just finished the first module of the algorithms specialization course on Coursera.

There was an exam question that I could not quite understand. I have already passed that exam, so there is no point in retaking it.

Out of curiosity, I want to learn the principles around this question.

The question was posed as follows:

Suppose that a randomized algorithm succeeds (e.g., correctly computes the minimum cut of a graph) with probability p (with 0 < p < 1). Let ϵ be a small positive number (less than 1).

How many independent times do you need to run the algorithm to ensure that, with probability at least 1−ϵ, at least one trial succeeds?

The options given were:

  • log(1−p)/logϵ
  • log(p)/logϵ
  • logϵ/log(p)
  • logϵ/log(1−p)

I made two attempts and both were wrong. My attempts were:

  1. log(1−p)/logϵ
  2. logϵ/log(1−p)

It's not so much that I want to know the right answer; I want to learn the principles behind this question and what it is asking for, so that I know how to answer similar questions in the future.

I posted this on the forum, but nobody answered after a month, so I am trying it out here.

No need to post the answer directly. If you get me to an aha moment, I will mark your answer as correct.

Thanks.


Solution

  • How many independent times do you need to run the algorithm to ensure that, with probability at least 1−ϵ, at least one trial succeeds?

    Let's rephrase it a bit:

    What is the smallest number of independent trials such that the probability of all of them failing is less than or equal to ϵ?

    By the definition of independent events, the probability of all of them occurring is the product of their individual probabilities. Since the probability of a single trial failing is (1-p), the probability of all n independent trials failing is (1-p)^n.

    This gives us an inequality for n:

    (1-p)^n <= ϵ
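
    If it helps to see that inequality in action, here is a minimal Python sketch (the function name min_trials and the sample values p = 0.3 and ϵ = 0.01 are purely illustrative) that finds the smallest n satisfying it by direct search:

    ```python
    def min_trials(p, eps):
        # Smallest n such that the probability of every one of n independent
        # runs failing, (1 - p)**n, drops to eps or below. Found by direct
        # search rather than by solving the inequality in closed form.
        n = 1
        while (1 - p) ** n > eps:
            n += 1
        return n

    # Hypothetical numbers: each run succeeds with probability p = 0.3 and we
    # want at least one success with probability 0.99, i.e. eps = 0.01.
    p, eps = 0.3, 0.01
    n = min_trials(p, eps)
    print(n, (1 - p) ** n)  # with n runs, the all-fail probability is <= eps
    ```

    Plugging the same p and ϵ into each of the four answer options and comparing with the printed n should get you to the aha moment without the answer being spelled out.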