Tags: big-o, complexity-theory, logarithm

Asymptotic Complexity of Logarithms and Powers


So, clearly, log(n) is O(n). But what about (log(n))^2? What about sqrt(n) versus log(n): which bounds which?

There's a family of comparisons like this:

n^a   vs.   (log(n))^b

I run into these comparisons a lot, and I've never found a good way to solve them. Any hints or tactics for handling the general case?


[EDIT: I'm not talking about the computational complexity of calculating the values of these functions. I'm talking about the functions themselves. E.g., f(n) = n is an upper bound on g(n) = log(n) because g(n) ≤ c·f(n) for c = 1 and all n ≥ n₀ = 1.]
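For concreteness, here's a minimal numeric spot-check of that witness pair (c = 1, n₀ = 1) in Python; it only samples a few values and is illustrative, not a proof:

    import math

    # Spot-check the Big-O witness from the example above:
    # g(n) = log(n) <= c * f(n) = c * n, with c = 1, for all n >= n0 = 1.
    c, n0 = 1, 1
    for n in [1, 2, 10, 100, 10**6, 10**12]:
        assert n >= n0 and math.log(n) <= c * n
        print(f"log({n}) = {math.log(n):.3f} <= {c * n}")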


Solution

  • log(n)^a is always O(n^b), for any positive constants a, b. (A quick numeric illustration appears at the end of this answer.)

    Are you looking for a proof? All such problems can be reduced to seeing that log(n) is O(n), by the following trick:

    log(n)^a = O(n^b) is equivalent to log(n) = O(n^{b/a}), since raising to the 1/a-th power is an increasing function. Setting m = n^{b/a}, this is equivalent to log(m^{a/b}) = O(m). Finally, since log(m^{a/b}) = (a/b)·log(m) and constant factors are absorbed into the O, this is equivalent to log(m) = O(m). The full chain is written out below.

    You can prove that log(n) = O(n) by induction, focusing on the case where n is a power of 2; a sketch of that induction follows the chain below.
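    Written out in LaTeX, the chain of equivalences above is (each step is reversible for a, b > 0):

        \begin{align*}
          (\log n)^a = O(n^b)
            &\iff \log n = O\big(n^{b/a}\big)
              && \text{raise to the $1/a$-th power (increasing)} \\
            &\iff \log\big(m^{a/b}\big) = O(m)
              && \text{substitute } m = n^{b/a} \\
            &\iff \tfrac{a}{b}\,\log m = O(m)
              && \log\big(m^{a/b}\big) = \tfrac{a}{b}\,\log m \\
            &\iff \log m = O(m)
              && \text{constant factors are absorbed by } O
        \end{align*}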
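    And here's a sketch of the induction itself, assuming base-2 logarithms and restricting to n = 2^k (so log₂(n) = k):

        \textbf{Claim:} $k \le 2^k$ for every integer $k \ge 0$,
        i.e.\ $\log_2 n \le n$ whenever $n = 2^k$.

        \textbf{Base case} ($k = 0$): $0 \le 1 = 2^0$.

        \textbf{Inductive step:} if $k \le 2^k$, then
        \[
          k + 1 \;\le\; 2^k + 1 \;\le\; 2^k + 2^k \;=\; 2^{k+1}.
        \]

        For general $n$, sandwich $2^k \le n < 2^{k+1}$ to get
        $\log_2 n < k + 1 \le 2 \cdot 2^k \le 2n$, so $c = 2$ works for all $n \ge 1$.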
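    Finally, a quick empirical illustration of the headline claim in Python. This is not a proof, and for very small b the ratio can keep growing for a very long time before it turns around, so the sample exponents below are chosen to turn around early:

        import math

        # Illustration: for fixed a, b > 0, the ratio (log n)^a / n^b
        # shrinks toward 0 once n is large enough.
        for a, b in [(2, 0.5), (3, 1.0)]:
            ratios = [math.log(n) ** a / n ** b
                      for n in (10**3, 10**6, 10**9, 10**12)]
            print(f"a={a}, b={b}:", ", ".join(f"{r:.3g}" for r in ratios))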