Tags: algorithm, recursion, time-complexity, non-recursive

Which is more efficient: n^2 or n*lg(n)*lg(n)?


A problem can be solved by a non-recursive algorithm in n^2 operations. The same problem can be solved by a recursive algorithm that takes n*lg(n) operations to divide the input into two equal pieces and lg(n) operations to combine the two solutions. Which algorithm do you think is more efficient?

EDIT: Base case: T(n) = 1 if n = 1.

This means that n*lg(n)*lg(n) will be more efficient than n^2. Right?
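
As a sanity check on that claim, here is a minimal sketch in Python, assuming the question's setup corresponds to the recurrence T(n) = 2*T(n/2) + n*lg(n) + lg(n) with T(1) = 1 and n restricted to powers of two (that reading is an assumption). It evaluates T(n) directly and prints it next to n*lg(n)*lg(n) and n^2:

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # Recurrence assumed from the question: two half-size subproblems,
        # n*lg(n) operations to divide, lg(n) operations to combine, T(1) = 1.
        if n == 1:
            return 1
        return 2 * T(n // 2) + n * math.log2(n) + math.log2(n)

    for k in (5, 10, 14, 17, 20):  # n = 32, 1024, ~1.6*10^4, ~1.3*10^5, ~10^6
        n = 2 ** k
        print(f"n={n:>8}  T(n)={T(n):>16.0f}  "
              f"n*lg(n)^2={n * math.log2(n) ** 2:>16.0f}  n^2={n * n:>16}")

The printed T(n) values stay within a small constant factor of n*lg(n)*lg(n), while n^2 pulls away quickly, which is consistent with the claim above.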


Solution

  • There's a question of how much additional work your recursive algorithm has to do compared to the "simple" O(n^2) version. For example, it may be a good idea to check for, say, n < 32 in your recursive implementation and use the O(n^2) sub-algorithm in that case. But for large enough n, O(n*log(n)*log(n)) will eventually be faster than O(n^2), as the cost-model sketch after the table illustrates.

    A table to demonstrate the difference in growth (log is log base 2):
    n        n^2       n*log(n)   n*[log(n)]^2   1000*n*[log(n)]^2
    32       1024      160        800            800 000
    1024     ~10^6     ~10^4      ~10^5          ~10^8
    10^4     ~10^8     ~10^5      ~2*10^6        ~2*10^9
    10^5     ~10^10    ~2*10^6    ~3*10^7        ~3*10^10
    10^6     ~10^12    ~2*10^7    ~4*10^8        ~4*10^11
    So, basically, even if you have 1000 times more operations for each "step" of your recursive algorithm, it still turns out faster once your n exceeds a million.
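
    To make that crossover point concrete, here is a minimal sketch in Python of the cost models behind the table. The functions are cost models only, not real implementations, and the 1000x per-step overhead factor is the same padding assumed in the last column. It searches over powers of two for the first n at which the padded recursive model becomes cheaper than n^2:

        import math

        def quadratic_cost(n):
            # Cost model for the non-recursive algorithm: n^2 operations.
            return n * n

        def recursive_cost(n, overhead=1000):
            # Cost model for the recursive algorithm, padded by a constant
            # per-step overhead factor: overhead * n * (lg n)^2 operations.
            return overhead * n * math.log2(n) ** 2

        # Search over powers of two for the first n where the padded
        # recursive model wins.
        n = 2
        while recursive_cost(n) >= quadratic_cost(n):
            n *= 2
        print(f"with a 1000x overhead, the recursive model wins from roughly n = {n}")

    Even with that deliberately pessimistic constant factor, the crossover lands below the million mark used in the table; with a realistic constant it comes much earlier, which is why the n < 32 cutoff mentioned above is only a constant-factor optimization and does not change which algorithm wins asymptotically.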