algorithm, recursion, time-complexity, recurrence, divide-and-conquer

Master theorem solution for the case d=log_b(a)


From Dasgupta's Algorithms: if the running time of a divide and conquer algorithm is described by the recurrence T(n)=aT(n/b)+O(n^d), then its solution is:

  1. T(n)=O(n^d) if d>log_b(a)
  2. T(n)=O(n^log_b(a)) if d<log_b(a)
  3. T(n)=O(n^d*log_2(n)) if d=log_b(a)

where b is the factor by which the subproblem size shrinks on each recursive call, a is the branching factor, and O((n/b^k)^d) is the time spent dividing and combining each subproblem at level k of the recursion tree.
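To make the three cases concrete, here is a minimal sketch (the helper name master_theorem_bound is made up for illustration) that classifies a recurrence T(n)=aT(n/b)+O(n^d) by comparing d with log_b(a):

```python
import math

def master_theorem_bound(a: int, b: int, d: float) -> str:
    """Classify T(n) = a*T(n/b) + O(n^d) by comparing d with log_b(a)."""
    critical = math.log(a, b)          # the critical exponent log_b(a)
    if math.isclose(d, critical):
        return f"O(n^{d} * log n)"     # case 3: every level does equal work
    if d > critical:
        return f"O(n^{d})"             # case 1: the root level dominates
    return f"O(n^log_{b}({a}))"        # case 2: the leaves dominate

# Merge sort: a=2, b=2, d=1, and log_2(2) = 1, so case 3 applies.
print(master_theorem_bound(2, 2, 1))   # O(n^1 * log n)
# Karatsuba multiplication: a=3, b=2, d=1, and 1 < log_2(3), so case 2.
print(master_theorem_bound(3, 2, 1))   # O(n^log_2(3))
```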

Cases 1 and 2 are straightforward - they follow from the geometric series formed when summing the work done at each level of the recursion tree, a^k*O((n/b^k)^d) = O(n^d)*(a/b^d)^k.
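That series is easy to verify numerically. The sketch below (parameters chosen for illustration) tabulates the per-level work a^k*(n/b^k)^d for a balanced instance where d = log_b(a); every level contributes exactly n^d:

```python
import math

def level_work(n: int, a: int, b: int, d: float, k: int) -> float:
    """Total work on level k of the recursion tree: a^k * (n/b^k)^d."""
    return (a ** k) * (n / b ** k) ** d

n, a, b, d = 4096, 2, 2, 1           # merge-sort-like: d = log_2(2) = 1
levels = round(math.log(n, b)) + 1   # the tree has log_b(n) + 1 levels
for k in range(levels):
    # here (a/b^d)^k = 1, so every level prints 4096 = n^d
    print(f"level {k:2d}: work = {level_work(n, a, b, d, k):.0f}")
```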

Where does the log_2(n) come from in case 3? When d=log_b(a), the ratio a/b^d equals 1, so summing the log_b(n)+1 equal terms of the series should give n^d*log_b(n), not n^d*log_2(n).


Solution

  • As a simpler example, first note that O(log n), O(log_137 n), and O(log_16 n) all mean the same thing. The reason is that, by the change of base formula for logarithms, for any fixed constant m we have

    log_m n = log n / log m = (1 / log m) · log n = O(log n).

    The Master Theorem assumes that a, b, and d are constants. From the change of base formula for logarithms, we then have O(n^d log_b n) = O(n^d log n), since b is a constant here.

    As a note, it’s unusual to see something written out as O(n^d log_2 n), since the log base here doesn’t matter and just contributes to the (already hidden) constant factor.
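    A quick numeric check of that last point, using only Python's standard library: the ratio log_137(n) / log_2(n) is the same constant for every n, so changing the log base only rescales the hidden constant in O(log n).

    ```python
    import math

    # log_137(n) / log_2(n) equals the constant 1 / log_2(137) for every n,
    # which is why O(log_2 n), O(log_16 n), and O(log_137 n) coincide.
    for n in (10, 1_000, 1_000_000):
        print(n, math.log(n, 137) / math.log(n, 2), 1 / math.log2(137))
    ```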