Tags: time-complexity, complexity-theory

Finding running time of the following code


public static int loop(int n) {
  int j = 1;
  int n2 = n;
  for (int i = 0; i < n; i++) {
    n2 *= 5.0 / 7.0;  // compound assignment narrows the double result back to int
    for (int k = 0; k < n2; k++) {
      System.out.println("Hello.");
    }
  }
  return j;
}

Hello all, I'm having some difficulties figuring out the time complexity of the above code.

Here's what I have so far:

When i=0, inner loop is executed: (5/7) * n

When i=1, inner loop is executed: (5/7)^2 * n

...

When i=n, inner loop is executed: (5/7)^(n+1) * n

Summing them up, I get O(n * (5/7)^n). Is this an accurate analysis?


Solution

  • Your conclusion isn't correct: when 0 < r < 1, the quantity n * r^n converges to 0 as n tends to infinity, so it certainly can't be an upper bound for the running time of an algorithm that does any work at all.
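    To see this numerically, here's a quick sketch (class and method names are my own) that evaluates n * r^n for r = 5/7 at a few values of n:

    ```java
    // Evaluates n * r^n for r = 5/7 to show it shrinks toward 0,
    // so it cannot bound the running time of a loop that does real work.
    public class VanishingBound {
        static double value(int n) {
            return n * Math.pow(5.0 / 7.0, n);
        }

        public static void main(String[] args) {
            for (int n : new int[] {10, 50, 100, 200}) {
                System.out.printf("n = %3d  ->  n * r^n = %.12f%n", n, value(n));
            }
        }
    }
    ```

    Each value is smaller than the last, which is the opposite of what a running-time bound should do as n grows.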

    In your algorithm, r = 5/7, but for readability I'll keep writing r in this answer. Summing the terms n * r^(i+1) for i = 0 to n gives n * r * (r^(n+1) - 1)/(r - 1); the extra factor of r appears because the sequence starts at r * n, not at 1 * n. We need to be careful simplifying this formula, because for 0 < r < 1 the numerator and denominator of the fraction are both negative, and the dominant term in the numerator is the constant -1, since r^(n+1) converges to 0 as n tends to infinity.

    If we rewrite it as n * r * (1 - r^(n+1))/(1 - r) then it's clearer. Discarding the constant factors of r and 1/(1 - r) we get O(n * (1 - r^(n+1))), and then discarding the dominated term -r^(n+1) we get O(n * 1), or just O(n).
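    Here's a sketch checking the closed form against the direct sum (names are my own; it sums the idealized real-valued counts, ignoring the integer truncation in the actual code):

    ```java
    // Compares the direct sum of n * r^(i+1) for i = 0..n with the closed
    // form n * r * (1 - r^(n+1)) / (1 - r), and shows the ratio sum / n
    // settling near the constant r / (1 - r), which is 2.5 for r = 5/7.
    public class GeometricSum {
        static final double R = 5.0 / 7.0;

        static double direct(int n) {
            double total = 0.0;
            for (int i = 0; i <= n; i++) {
                total += n * Math.pow(R, i + 1);
            }
            return total;
        }

        static double closedForm(int n) {
            return n * R * (1 - Math.pow(R, n + 1)) / (1 - R);
        }

        public static void main(String[] args) {
            for (int n : new int[] {10, 100, 1000}) {
                System.out.printf("n = %4d  sum = %10.3f  sum / n = %.4f%n",
                                  n, direct(n), direct(n) / n);
            }
        }
    }
    ```

    The ratio sum / n stays below a fixed constant no matter how large n gets, which is exactly the O(n) claim.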

    Another way of seeing this is that the sum is bounded above by n * (r + r^2 + r^3 + ...), where the infinite series converges to the constant r/(1 - r), so the total is bounded above by n times a constant.
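    To tie this back to the code, here's a sketch (names are mine) that replicates the loop's arithmetic but counts inner-loop iterations instead of printing; for r = 5/7 the constant r/(1 - r) is 2.5, so the total stays below 2.5 * n:

    ```java
    // Mirrors loop(n) but counts inner-loop iterations instead of printing.
    // The total is bounded by n * r / (1 - r) = 2.5 * n for r = 5/7.
    public class LoopCount {
        static long count(int n) {
            long total = 0;
            int n2 = n;
            for (int i = 0; i < n; i++) {
                n2 *= 5.0 / 7.0;  // narrows back to int, truncating the double result
                for (int k = 0; k < n2; k++) {
                    total++;
                }
            }
            return total;
        }

        public static void main(String[] args) {
            for (int n : new int[] {100, 1000, 10000}) {
                System.out.printf("n = %5d  iterations = %7d  ratio = %.3f%n",
                                  n, count(n), (double) count(n) / n);
            }
        }
    }
    ```

    Because n2 is truncated to an int each time, the real count comes in slightly under the 2.5 * n bound, and once n2 reaches 0 the inner loop contributes nothing for the remaining iterations of i.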