Tags: algorithm, recursion, big-o, asymptotic-complexity, code-complexity

Calculating Big O complexity of Recursive Algorithms


Somehow, I find it much harder to derive Big O complexities for recursive algorithms than for iterative algorithms. Could you provide some insight into how I should go about solving these two questions?

*Assume that submethod has linear complexity.

def myMethod(n)
  if n > 0
    submethod(n)     # assumed to take linear time in n
    myMethod(n / 2)  # recurse on half the input (integer division)
  end
end

def myMethod(k, n)
  if n > 0
    submethod(k)        # assumed to take linear time in k
    myMethod(k, n / 2)  # n halves each call, k stays fixed
  end
end

Solution

  • For your first problem, the recurrence will be:

        T(n) = n + T(n/2)
    
        T(n/2) = n/2 + T(n/4)
        ...
        ...
        ...
        T(2) = 2 + T(1)
    
        T(1) = 1 + T(0) // assuming 1/2 equals 0 (integer division)
    
    Adding these up, we get:
    
        T(n) = n + n/2 + n/4 + n/8 + ... + 1 + T(0)
    
             = n(1 + 1/2 + 1/4 + 1/8 + ...) + k   // writing k = T(0)
    
             = n * 1/(1 - 1/2) + k                // geometric series sums to a/(1 - r) as the number of terms tends to infinity
    
             = 2n + k
    

    Therefore, T(n) = O(n). Remember, I have assumed that n tends to infinity, because that is what we do in asymptotic analysis.
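
    As a quick sanity check of that closed form, here is a small Ruby sketch (the $ops counter, the cost model inside submethod, and the driver loop are illustrative additions, not part of the original question). It charges m units of work for every call submethod(m), per the linear-complexity assumption, and counts the total work done by myMethod(n):

        $ops = 0

        def submethod(m)
          $ops += m          # model a linear-cost subroutine: m units per call
        end

        def myMethod(n)
          if n > 0
            submethod(n)
            myMethod(n / 2)  # integer division, as assumed in the derivation
          end
        end

        [10, 100, 1_000, 10_000].each do |n|
          $ops = 0
          myMethod(n)
          puts "n=#{n}  ops=#{$ops}  2n=#{2 * n}"
        end

    The printed totals stay just below 2n, which matches T(n) = 2n + k and hence O(n).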

    For your second problem, the recurrence is T(n) = k + T(n/2): we perform k primitive operations (one linear call to submethod(k)) each time, and n is halved until it reaches 0, which happens about log(n) times. Therefore, T(n) = O(k * log(n)).
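
    The same kind of check works for the second method: each recursive level does k units of work and n halves until it hits 0, giving roughly log2(n) levels. The sketch below (again with an illustrative $ops counter and driver loop, not part of the question) compares the counted work against k * log2(n):

        $ops = 0

        def submethod(m)
          $ops += m              # linear-cost subroutine: m units per call
        end

        def myMethod(k, n)
          if n > 0
            submethod(k)         # k units of work at each level
            myMethod(k, n / 2)   # n halves, k stays fixed, so about log2(n) levels
          end
        end

        k = 5
        [16, 1_024, 1_000_000].each do |n|
          $ops = 0
          myMethod(k, n)
          puts "n=#{n}  ops=#{$ops}  k*log2(n)=#{(k * Math.log2(n)).round}"
        end

    The counted work grows in proportion to k times log2(n), consistent with O(k * log(n)).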