Tags: algorithm, math, asymptotic-complexity, recurrence

Solving recurrences with iteration, substitution, Master Theorem?


I'm familiar with solving recurrences with iteration:

t(1) = c1
t(2) = t(1) + c2 = c1 + c2
t(3) = t(2) + c2 = c1 + 2c2
...
t(n) = c1 + (n-1)c2 = O(n)
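For example, a quick numeric check confirms that closed form (just a sketch: the constants c1 = 5 and c2 = 3 are arbitrary values picked for illustration, and I'm assuming the recurrence being iterated is t(n) = t(n-1) + c2):

# Sanity check of the iterated solution t(n) = c1 + (n-1)*c2,
# for the recurrence t(n) = t(n-1) + c2 with t(1) = c1.
# c1 = 5 and c2 = 3 are arbitrary constants chosen for illustration.
def t(n, c1=5, c2=3):
    total = c1                  # t(1) = c1
    for _ in range(n - 1):
        total += c2             # each unrolled step adds another c2
    return total

for n in range(1, 20):
    assert t(n) == 5 + (n - 1) * 3   # matches c1 + (n-1)*c2, i.e. O(n)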

But what if I had a recurrence with no base case? How would I solve it using the three methods mentioned in the title?

t(n) = 2t(n/2) + 1

For the Master Theorem I know the first step: find a, b, and f(n):

a = 2
b = 2
f(n) = 1

But I don't know where to go from here; I'm at a standstill because I'm not sure how to approach the question.


Solution

  • I know of 2 ways to solve this:

    (1) T(n) = 2T(n/2) + 1
    (2) T(n/2) = 2T(n/4) + 1
    now replace T(n/2) from (2) into (1)
    
    T(n) = 2[2T(n/4) + 1] + 1
         = 2^2T(n/4) + 2 + 1
    
    T(n/4) = 2T(n/8) + 1
    T(n) = 2^2[2T(n/8) + 1] + 2 + 1
         = 2^3T(n/8) + 4 + 2 + 1
    

    You would just keep doing this until you can generalize. Eventually you will spot that:

    T(n) = 2^k T(n/2^k) + (2^(k-1) + 2^(k-2) + ... + 2 + 1)
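    The constant terms form a geometric series; written out (in LaTeX notation just for clarity), the k-th expansion evaluates to:

    \[
      T(n) = 2^k\,T\!\left(\frac{n}{2^k}\right) + \sum_{i=0}^{k-1} 2^i
           = 2^k\,T\!\left(\frac{n}{2^k}\right) + \frac{1 - 2^k}{1 - 2}
           = 2^k\,T\!\left(\frac{n}{2^k}\right) + 2^k - 1
    \]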
    

    You want to reach T(1), so set n/2^k = 1 and solve for k. When you do this you will find that k = lg n.

    Substituting lg n for k, you end up with

    T(n) = 2^(lg n) T(n/2^(lg n)) + (1 - 2^(lg n)) / (1 - 2)
    2^(lg n) = n, so

    T(n) = nT(1) + n - 1
    T(n) = n + n - 1 = 2n - 1 (taking T(1) = 1), where n is the dominant term.
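    As a sanity check, here is a minimal sketch (assuming the base case T(1) = 1, which the step nT(1) = n above implicitly uses) comparing the recurrence against the closed form for powers of two:

    # Check T(n) = 2T(n/2) + 1 against the closed form T(n) = 2n - 1.
    # The base case T(1) = 1 is an assumption; the question gave none.
    def T(n):
        if n == 1:
            return 1
        return 2 * T(n // 2) + 1    # the recurrence, for n a power of 2

    for k in range(11):
        n = 2 ** k
        assert T(n) == 2 * n - 1    # matches 2n - 1 = Θ(n)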
    

    For the Master Theorem it's really fast.

    Consider T(n) = aT(n/b) + n^c for n > 1
    

    There are three cases (note that b is the log base)

    (1)  if log_b(a) < c, then T(n) = Θ(n^c),
    
    (2)  if log_b(a) = c, then T(n) = Θ(n^c log n),
    
    (3)  if log_b(a) > c, then T(n) = Θ(n^(log_b(a))).
    

    In this case a = 2, b = 2, and c = 0 (n^0 = 1)

    A quick check shows case 3, since log_2(2) = 1 > 0 = c.

    T(n) = Θ(n^(log_2 2))
    
    Note that log_2(2) is 1.
    
    So by the Master Theorem this is Θ(n).
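
    If you want to automate that quick check, here is an illustrative sketch (the function name master_case and its output strings are my own, not any library API) that classifies a recurrence of the form T(n) = aT(n/b) + n^c into the three cases above:

    import math

    # Classify T(n) = a*T(n/b) + n^c using the three cases listed above.
    def master_case(a, b, c):
        log_b_a = math.log(a, b)            # log base b of a
        if math.isclose(log_b_a, c):
            return f"case 2: Theta(n^{c} log n)"
        if log_b_a < c:
            return f"case 1: Theta(n^{c})"
        return f"case 3: Theta(n^{log_b_a})"

    print(master_case(2, 2, 0))   # T(n) = 2T(n/2) + 1 -> case 3: Theta(n^1.0), i.e. Theta(n)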