Tags: algorithm, time-complexity, recurrence

Get the complexity of T(n)=T(n/4)+T(3n/4)+c


In the recurrence relation T(n) = T(n/4) + T(3n/4) + c, I am confused about how best-case and worst-case analysis relate to it: both subproblems, of sizes n/4 and 3n/4, always have to be solved, so what would "best case" or "worst case" even refer to here?

Moreover, should we use Theta(log n) or O(log n) here? From the link below, O(log n) looked more applicable, but I still could not see why we would not use Theta(log n).

How to solve the recursive complexity T(n) = T(n/4)+T(3n/4)+cn


Solution

  • T(n) = T(n/4) + T(3n/4) + CONST <= 2T(3n/4) + CONST


    A natural first attempt is to apply case 1 of the master theorem to the majorizing recurrence 2T(3n/4) + CONST, with:

    a = 2, b = 4/3
    log_{4/3}(2) ~= 2.41
    CONST is in O(n^{log_{4/3}(2) - eps}) for some eps > 0


    Thus, from the master theorem, one can derive that 2T(3n/4) + CONST is in Theta(n^{log_{4/3}(2)}) ~= Theta(n^2.41), and since T(n) <= 2T(3n/4) + CONST, this only gives T(n) in O(n^2.41), a valid but very loose upper bound.
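
    For completeness, this is the case of the master theorem being invoked here, in its usual (CLRS) form:

        T(n) = a \cdot T(n/b) + f(n), \quad f(n) = O\left(n^{\log_b a - \epsilon}\right) \text{ for some } \epsilon > 0
        \implies T(n) = \Theta\left(n^{\log_b a}\right)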

    Following the same idea, but for a lower bound, drop the smaller subproblem:

    T(n) >= T(3n/4) + CONST

    Here a = 1 and b = 4/3, so log_{4/3}(1) = 0 and CONST is Theta(n^0) = Theta(1). Case 2 of the master theorem tells us T(3n/4) + CONST is in Theta(logn), and therefore T(n) is in Omega(logn).
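
    And this is the case used for the lower bound, again in its usual (CLRS) form:

        T(n) = a \cdot T(n/b) + f(n), \quad f(n) = \Theta\left(n^{\log_b a}\right)
        \implies T(n) = \Theta\left(n^{\log_b a} \log n\right)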

    These two bounds, O(n^2.41) and Omega(logn), do not meet, so sandwiching with the master theorem is too blunt for this recurrence. A recursion-tree argument closes the gap: every node of the tree contributes the same constant cost c, the tree is binary (so it has fewer internal nodes than leaves), and the number of leaves L(n) satisfies L(n) = L(n/4) + L(3n/4) with L(1) = 1, whose solution is Theta(n) (note that n/4 + 3n/4 = n, so L(n) = n satisfies the leaf recurrence exactly). The total number of nodes is therefore Theta(n), each contributing constant work, so T(n) is both O(n) and Omega(n), i.e. T(n) is in Theta(n). In particular the answer is not Theta(logn); logn is only a loose lower bound.
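
    If you prefer to get the tight bound from a theorem rather than from the recursion-tree picture, the Akra-Bazzi method applies directly to the original two-term recurrence; here is a sketch, with the constant c playing the role of g(n):

        \left(\frac{1}{4}\right)^p + \left(\frac{3}{4}\right)^p = 1 \implies p = 1

        T(n) = \Theta\left(n^p \left(1 + \int_1^n \frac{g(u)}{u^{p+1}} \, du\right)\right)
             = \Theta\left(n \left(1 + \int_1^n \frac{c}{u^2} \, du\right)\right)
             = \Theta\left(n \left(1 + c \left(1 - \frac{1}{n}\right)\right)\right)
             = \Theta(n)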


    As for your question, you can use either big-O or Theta notation, whatever you prefer. As you can see, proving Theta requires a bit more work, but it is also more informative, as it tells you the bound you found is tight.
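
    If you like to sanity-check a bound like this numerically, here is a minimal sketch in Python. The integer split max(1, n // 4) versus n - max(1, n // 4), the cost constant C = 1, and the base case T(1) = 1 are modelling choices of mine, not part of the recurrence, and none of them changes the growth rate. With these choices the recursion tree has exactly n leaves and n - 1 internal nodes, so the program prints T(n) = 2n - 1: the ratio T(n)/n settles near 2, while T(n)/log2(n) keeps growing.

        from functools import lru_cache
        import math

        C = 1  # per-call constant cost "c" (assumed value; any constant gives the same growth)

        @lru_cache(maxsize=None)
        def T(n: int) -> int:
            """Evaluate T(n) = T(n/4) + T(3n/4) + c with integer subproblem sizes."""
            if n <= 1:
                return 1                # assumed base case T(1) = 1
            left = max(1, n // 4)       # the "n/4" part, kept >= 1 so the recursion shrinks
            right = n - left            # the "3n/4" part
            return T(left) + T(right) + C

        for n in (10**2, 10**3, 10**4, 10**5):
            t = T(n)
            print(f"n={n:>8}  T(n)={t:>9}  T(n)/n={t / n:.3f}  T(n)/log2(n)={t / math.log2(n):.1f}")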