
Is O(n^3) really more efficient than O(2^n)?


My algorithm class's homework claims that O(n^3) is more efficient than O(2^n).

When I put these functions into a graphing calculator, f(x) = 2^x appears to be consistently more efficient for very large n (starting from around n = 982).

Considering that for a function f(n) = O(g(n)), it must be smaller for all n greater than some n_0, wouldn't this mean that from n = 982 we could say that O(2^n) is more efficient?


Solution

  • 2^n grows much faster than n^3. Maybe you entered the wrong values into your calculator or something like that. Also note that more efficient means less time, which means a lower value on the y-axis.

    Let me show you some correct plots for those functions (using Wolfram Alpha):

    [Plot: early course of 2^n and n^3]

    First 2^n is smaller (but only for a tiny range); after that you can see how 2^n grows beyond it.

    After this intersection the situation never changes again and 2^n remains vastly greater than n^3. That also holds for the range you analysed, i.e. n > 982, as seen in the next plot (the plot for n^3 is near the x-axis):

    [Plot: 2^n extremely greater than n^3 for large n]
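    If you prefer numbers over plots, here is a small sketch (Python, purely illustrative and not part of the original answer) that shows the crossover near n = 10 and the enormous gap at n = 982:

    ```python
    # Compare n^3 and 2^n at a few sample points.
    # 2^n is below n^3 only for small n (roughly 2..9) and overtakes it at n = 10.
    for n in [1, 5, 9, 10, 20, 100, 982]:
        cubic = n ** 3
        exponential = 2 ** n
        print(f"n = {n:4d}   n^3 = {cubic}   2^n has {len(str(exponential))} digits")
    ```

    At n = 982, n^3 is still below 10^9, while 2^n already has almost 300 decimal digits.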


    Also note that in Big-O notation we always compare functions based on their growth. This is why a set like O(n^3) does not contain only the functions f with f(x) <= n^3, but rather all f with f(x) <= C * n^3, where C is an arbitrary constant; it could be big, it could be small. This constant accounts for the growth factor in the comparison. Also note that the condition is allowed to fail for a finite number of x, but there must exist some bound x' from which on the condition holds, so for every x > x'.
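    To make the role of C and x' concrete, consider a made-up example: f(x) = 5x^3 + 1000 is in O(n^3), because with C = 6 the bound only fails for x < 10 and holds for every x >= 10 (so x' = 10). A quick check in Python (again just a sketch, not from the original answer):

    ```python
    # f(x) = 5*x^3 + 1000 is in O(x^3): with C = 6 the bound C*x^3 holds
    # for every x >= 10 (the threshold x'), even though it fails for small x.
    def f(x):
        return 5 * x**3 + 1000

    C, x_prime = 6, 10
    print(all(f(x) <= C * x**3 for x in range(x_prime, 10_000)))  # True
    print([x for x in range(1, x_prime) if f(x) > C * x**3])      # the finitely many failures
    ```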

    Compare this explanation to the complete mathematical definition from Wikipedia where C is k, x is n and x' is n_0:

    [Image: formal definition of Big-O from Wikipedia]
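    Since the image is not reproduced here, the definition reads, paraphrased in that notation:

    ```latex
    f(n) = O(g(n)) \quad \text{as } n \to \infty
    \iff
    \exists\, k > 0 \;\; \exists\, n_0 \;\; \forall\, n > n_0 : \; |f(n)| \le k \cdot g(n)
    ```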

    If that condition holds, then f(n) is in the set O(g(n)).