python, recursion, big-o, logarithm

What is the big-O runtime of two recursive O(log n) calls?


def f(L):
    if len(L) < 1_000_000_000:  # base case: the list is shorter than one billion elements
        return L
    else:
        return f(L[:len(L) // 2]) + f(L[len(L) // 2:])

L is a list of size n

I know that if it were a single recursive call, then it would be O(log n), but there are two recursive calls here.

However, when I ran it in a visualizer, it started to exhibit more of an O(n) runtime.

In my opinion, it should be O(log n + log n) = O(2 log n) = O(log n). Am I correct?
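
One way to check this empirically is to count the calls directly rather than relying on a visualizer. Here is a minimal sketch, assuming a small threshold stands in for the one-billion base case so that the recursion bottoms out on modest inputs:

CALLS = 0

def f_counted(L, threshold=1):
    # Same splitting as f, but with a small assumed threshold and a global
    # counter recording the total number of calls made.
    global CALLS
    CALLS += 1
    if len(L) <= threshold:
        return L
    mid = len(L) // 2
    return f_counted(L[:mid], threshold) + f_counted(L[mid:], threshold)

for n in (1_000, 10_000, 100_000):
    CALLS = 0
    f_counted(list(range(n)))
    print(n, CALLS)  # the call count grows roughly in proportion to n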


Solution

  • Consider how many calls you're making. At the first level of the recursion you make two calls. Each of those makes two more calls, and so on. This means that at level i of the recursion you will have made a total of O(2^i) function calls.

    How many levels of the recursion are there? This is just the height of a binary tree with n elements, which is O(log_2 n).

    So by the time you reach all the leaves of the recursion, you will have made O(2^(log_2 n)) = O(n) function calls (the sketch below illustrates this per-level count).

    --

    Another way of looking at it: you eventually have to piece the entire list back together, so how could you possibly do that in less than O(n) time?
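
    As a rough illustration of the counting argument above, here is an instrumented sketch (assuming a base case of size 1 rather than the original threshold) that records how many calls occur at each depth of the recursion:

    from collections import Counter

    def f_levels(L, depth, per_level):
        # Same splitting as f, but tallies one call at the current depth.
        per_level[depth] += 1
        if len(L) <= 1:
            return L
        mid = len(L) // 2
        return (f_levels(L[:mid], depth + 1, per_level)
                + f_levels(L[mid:], depth + 1, per_level))

    n = 1024
    per_level = Counter()
    f_levels(list(range(n)), 0, per_level)
    for depth in sorted(per_level):
        print(depth, per_level[depth])        # 1, 2, 4, 8, ... doubling each level
    print("total:", sum(per_level.values()))  # 2*n - 1, i.e. O(n) calls in all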