Tags: algorithm, recursion, big-o, complexity-theory

Time complexity of a recursive function with two variables


I have a recursive function that takes two variables and then splits into two calls:

def f(n, m):
    # Base case: stop as soon as either argument reaches zero.
    if n == 0 or m == 0:
        return 1

    return f(n - 1, m) + f(n, m - 1)

The first two levels of the recursion tree look like this:

                    f(n-2, m)
                   /
          f(n-1, m)
         /         \
        /           f(n-1, m-1)
       /
f(n, m)
       \
        \           f(n-1, m-1)
         \         /
          f(n, m-1)
                   \
                    f(n, m-2)

I know that a function which splits into two calls has an exponential time complexity of O(2^n), but how can I also account for the second argument and its base-case condition?
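
To get a feel for the growth, I instrumented the function with a simple call counter (the global calls variable is just scaffolding I added for this experiment):

calls = 0

def f(n, m):
    global calls
    calls += 1
    if n == 0 or m == 0:
        return 1

    return f(n - 1, m) + f(n, m - 1)

for k in range(1, 7):
    calls = 0
    f(k, k)
    print(f"f({k}, {k}) made {calls} calls")

The count clearly depends on both arguments, not just n, and that's the dependence I'm trying to capture in the bound.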


Solution

  • What you pass to each function isn't going to change the time complexity.

    Big O is supposed to describe the worst case, or at the very least the common case, not the best case. If you insert items into a binary search tree in alphabetical order, its performance degrades to O(n) per insertion, not O(log2 n). Yet nobody says that insertion into a binary search tree is O(n), because that degenerate ordering isn't the common case (see the sketch below).
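
    To make that concrete, here is a minimal sketch (the Node, insert, and depth helpers are illustrative names of my own, not from any particular library): feeding already-sorted keys into a plain, unbalanced binary search tree degenerates it into a chain, so each insertion walks the whole structure.

    class Node:
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None

    def insert(root, key):
        # Plain, unbalanced binary-search-tree insertion.
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        else:
            root.right = insert(root.right, key)
        return root

    def depth(root):
        if root is None:
            return 0
        return 1 + max(depth(root.left), depth(root.right))

    root = None
    for key in "abcdefghij":  # alphabetical order: every key goes right
        root = insert(root, key)

    print(depth(root))  # 10 levels for 10 keys -- O(n) per insertion, not O(log2 n)

    A self-balancing variant (a red-black tree, for example) avoids this degeneration, which is why the O(log2 n) figure is the one people quote.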