Tags: big-o, complexity-theory, binary-search, theory, asymptotic-complexity

How is Asymptotic Complexity Derived for Recursive Functions


I haven't found a general answer to this on the site.

For example:

If I have some algorithm, say binary search, how do I derive (i.e. mathematically show) that its complexity is O(log(n))?

But more generally, how do I derive the asymptotic complexity of any recursive algorithm?


Solution

  • The running time of many recursive algorithms can be expressed recursively, just like the algorithm itself. Such an equation is called a recurrence relation, and it typically has the form

    T(N) = [sum of T(N_i) for recursive call i] + [sum of other work done in the function]

    For example, binary search has a very simple recurrence relation: on each recursive call 1) the search space is halved, and 2) a constant amount of work is done. Thus the relation has the form T(N) = T(N / 2) + C. To solve it, substitute repeatedly and spot a pattern:

    T(N) = T(N / (2^1)) + C
         = T(N / (2^2)) + 2C
         = T(N / (2^3)) + 3C
         = ... 
         = T(N / (2^m)) + mC
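
    This unrolling can also be checked numerically. Below is a small sketch (in Python, my choice since the question names no language), taking the per-call work C = 1 and assuming the base case T(1) = C:

    ```python
    import math

    def T(n, c=1):
        """Evaluate the recurrence T(N) = T(N / 2) + C, with T(1) = C."""
        if n <= 1:
            return c
        return T(n // 2, c) + c

    # For powers of two, T(N) comes out to exactly C * (log2(N) + 1),
    # matching the pattern T(N) = T(N / 2^m) + mC with m = log2(N).
    for n in (2, 64, 4096, 1 << 20):
        print(n, T(n), int(math.log2(n)) + 1)
    ```

    Changing `c` scales every value by the same constant factor, which is exactly why C drops out of the big-O bound.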
    

    Binary search terminates when the search space is down to one element, i.e. when N / (2^m) = 1. This corresponds to m = log2(N), at which point T(N / (2^m)) = T(1) is just a constant.

    Thus the time complexity is O(m) = O(log N). (Neither the base of the logarithm nor C matters, since each only changes the result by a constant factor.)
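
    To see the O(log N) bound empirically, here is a sketch (again in Python, purely as an illustration) of a recursive binary search instrumented to report its recursion depth; in the worst case the depth stays within ceil(log2(N)) + 1:

    ```python
    import math

    def binary_search(arr, target, lo=0, hi=None, depth=0):
        """Recursive binary search on a sorted list.
        Returns (index or None, recursion depth reached)."""
        if hi is None:
            hi = len(arr)
        if lo >= hi:                  # empty search space: not found
            return None, depth
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, depth
        if arr[mid] < target:         # recurse into the upper half
            return binary_search(arr, target, mid + 1, hi, depth + 1)
        return binary_search(arr, target, lo, mid, depth + 1)  # lower half

    for n in (16, 1024, 1_000_000):
        arr = list(range(n))
        _, depth = binary_search(arr, -1)   # worst case: element absent
        print(n, depth, math.ceil(math.log2(n)) + 1)
    ```

    For n = 16 the worst-case depth is 5 = log2(16) + 1, and doubling n adds only one more level, which is the O(log N) behaviour derived above.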