I know that we can apply the Master Theorem to find the running time of a divide and conquer algorithm, when the recurrence relation has the form of:
T(n) = a*T(n/b) + f(n)
We know the following:

- a is the number of subproblems into which the algorithm divides the original problem
- n/b is the size of each sub-problem
- f(n) encompasses the cost of dividing the problem and combining the results of the subproblems

Now we then find something (I will come back to the term "something") and we have 3 cases to check:
1. If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) is O(n^(log_b(a))).
2. If f(n) = O(n^(log_b(a))), then T(n) is O(n^(log_b(a)) * log n).
3. If n^(log_b(a) + ε) = O(f(n)) for some constant ε > 0, and if a*f(n/b) <= c*f(n) for some constant c < 1 and almost all n, then T(n) = O(f(n)).

All fine, so now I come back to the term "something": how can we use general examples (i.e. ones that use variables rather than actual numbers) to decide which case the algorithm falls into?
For instance, consider the following:
T(n) = 8T(n/2) + n
So a = 8, b = 2, and f(n) = n.
How will I proceed then? How can I decide which case applies? And while the cases state f(n) = some big-Oh expression, how are these two things even comparable? The above is just an example to show you where I don't get it, so the question is general.
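To make concrete how a, b, and f(n) map onto an actual algorithm, the recurrence form can be sketched generically like this (a Python sketch that just tallies work rather than solving a real problem; the base case T(1) = 1 is an assumption):

```python
def T(n, a, b, f):
    # Total work of a divide-and-conquer satisfying T(n) = a*T(n/b) + f(n),
    # with T(n) = 1 for n <= 1 assumed. Subproblem sizes are rounded down.
    if n <= 1:
        return 1
    return a * T(n // b, a, b, f) + f(n)

# The example recurrence: a = 8 subproblems of size n/2, f(n) = n combine cost.
print(T(1024, 8, 2, lambda n: n))
```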
Thanks
As CLRS suggests, the basic idea is comparing f(n) with n^(log_b(a)), i.e. n to the power (log a to the base b). In your hypothetical example, we have:

f(n) = n
n^(log_b(a)) = n^(log_2(8)) = n^3

i.e. n-cubed, as your recurrence yields 8 problems of half the size at every step. Thus, in this case, n^(log_b(a)) is larger, because n is O(n^3) (and not the other way around), and the solution is: T(n) = Θ(n^3).
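One quick way to see the Θ(n^3) behaviour numerically is to evaluate the recurrence directly and check that doubling n multiplies T(n) by roughly 2^3 = 8 (a sketch, assuming T(1) = 1 and n a power of 2):

```python
def T(n):
    # T(n) = 8*T(n/2) + n, with T(1) = 1 assumed as the base case.
    return 1 if n == 1 else 8 * T(n // 2) + n

for n in (64, 128, 256, 512):
    # For a Theta(n^3) solution the ratio T(2n)/T(n) tends to 2^3 = 8.
    print(n, T(2 * n) / T(n))
```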
Clearly, the number of subproblems vastly outpaces the work (linear, f(n) = n) that you are doing for each subproblem. Thus, intuition suggests, and the master theorem verifies, that it is the n^(log_b(a)) term that dominates the recurrence.
There is a subtle technicality: the master theorem requires that f(n) be not only smaller than n^(log_b(a)) O-wise, but smaller polynomially, i.e. f(n) = O(n^(log_b(a) - ε)) for some ε > 0. That holds here, since n = O(n^(3 - ε)) for, say, ε = 1, so case 1 applies.
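When f(n) is itself a plain polynomial n^k, "polynomially smaller/larger" reduces to comparing the exponents k and log_b(a), so the case analysis can be sketched mechanically (this covers f(n) = n^k only; forms like n log n fall outside these three cases):

```python
import math

def master_case(a, b, k):
    # Classify T(n) = a*T(n/b) + n^k. The critical exponent is log_b(a);
    # for f(n) = n^k the case-3 regularity condition holds automatically.
    crit = math.log(a) / math.log(b)
    if k < crit - 1e-9:                         # case 1: f polynomially smaller
        return f"T(n) = Theta(n^{crit:g})"
    if k > crit + 1e-9:                         # case 3: f polynomially larger
        return f"T(n) = Theta(n^{k:g})"
    return f"T(n) = Theta(n^{crit:g} * log n)"  # case 2: same exponent

print(master_case(8, 2, 1))   # the example above: case 1
```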