
Merge sort - recursion tree


So I've been studying sorting algorithms.

I am stuck on finding the complexity of merge sort.

Can someone please explain to me how the height of the recursion tree comes out to h = 1 + lg(n)?

[figure: the merge sort recursion tree, with its height labeled h = 1 + lg(n)]


Solution

  • If you keep dividing n by 2, you'll eventually get to 1.
    Namely, it takes log2(n) divisions by 2 to make this happen, by definition of the logarithm.

    Every time we divide by 2, we add a new level to the recursion tree.
    Add that to the root level (which didn't require any divisions), and we have log2(n) + 1 levels total. (A short Python sketch at the end of this answer checks this count.)


    Here's a cooler proof. Start from the merge sort recurrence T(n) = 2 T(n/2) + c n; replacing n with 2n and rearranging, we have T(2n) - 2 T(n) = 2 c n.

    If n = 2^k, then we have T(2^(k+1)) - 2 T(2^k) = 2 c 2^k.

    Let's simplify the mess. Let's define U(k) = T(2^k) / (2 c).

    Then we have U(k + 1) - 2 U(k) = 2^k, or, if we define U'(k) = U(k + 1) - U(k):

    U'(k) - U(k) = 2^k

    k is discrete here, but we can let it be continuous, and if we do, then U' is the derivative of U.

    At that point the solution is obvious: if you've ever taken derivatives, you know that if the difference between a function and its derivative is exponential, then the function itself has to be exponential (since only in that case is the derivative a multiple of the function itself).

    At that point you know U(k) is exponential, so you can plug in an exponential ansatz with unknown coefficients, solve for them, and substitute back to recover T. The last step is carried out below.
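
For completeness, here is one way to carry out that last step. This is only a sketch (not from the original answer); the one wrinkle is that 2^k already solves the homogeneous equation, so the particular solution needs an extra factor of k in front of the exponential.

```latex
% Solving U(k+1) - 2U(k) = 2^k and recovering T(n).
\begin{align*}
  &\text{Homogeneous part: } U_h(k) = A \cdot 2^{k}.\\
  &\text{Particular part: try } U_p(k) = \alpha\, k\, 2^{k}\text{, so that}\\
  &\qquad \alpha (k+1)\, 2^{k+1} - 2 \alpha\, k\, 2^{k} = 2\alpha \cdot 2^{k} = 2^{k}
     \quad\Longrightarrow\quad \alpha = \tfrac{1}{2}.\\
  &\text{General solution: } U(k) = A \cdot 2^{k} + \tfrac{k}{2}\, 2^{k}.\\
  &\text{Back-substitute } T(2^{k}) = 2c\,U(k) \text{ with } k = \log_2 n\text{, i.e. } n = 2^{k}:\\
  &\qquad T(n) = 2Ac\, n + c\, n \log_2 n = \Theta(n \log n).
\end{align*}
```

The constant A is fixed by the base case T(1); either way the n log n term dominates.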
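
And here is the small sketch promised in the first part of the answer: a few lines of Python (the helper name tree_height is just illustrative) that count the levels of the recursion tree by repeated halving and check the log2(n) + 1 formula for powers of two.

```python
import math

def tree_height(n: int) -> int:
    """Count the levels of the merge sort recursion tree for input size n."""
    levels = 1          # the root level, reached with no divisions
    while n > 1:
        n //= 2         # each division by 2 adds one level below
        levels += 1
    return levels

# For n a power of two, the height is exactly log2(n) + 1.
for n in (1, 2, 4, 8, 16, 1024):
    assert tree_height(n) == int(math.log2(n)) + 1
    print(n, "->", tree_height(n), "levels")
```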