I've been watching some video lectures from MIT's OpenCourseWare website, and in the third lecture the lecturer goes over recursive matrix multiplication and comes up with the time complexity:

T(n) = Θ(n^3)
It's obvious that I need to review some of my math, but I really don't see the connection between that answer and any of the cases of the Master Theorem. I know that the recurrence is of the form:

T(n) = a*T(n/b) + f(n)

for n > 1.
With this recurrence: a = 8, b = 2, and f(n) = Θ(n^2).
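For context, here is a minimal Python sketch of the divide-and-conquer scheme (my own illustration, not the lecture's code), assuming n is a power of 2. You can see where the parameters come from: 8 recursive calls (a = 8) on half-size blocks (b = 2), combined with Θ(n^2) additions (f(n)):

```python
# Divide-and-conquer matrix multiplication on n x n lists of lists,
# assuming n is a power of 2. Matches T(n) = 8*T(n/2) + Theta(n^2).

def mat_mul(A, B):
    n = len(A)
    if n == 1:  # base case: 1x1 matrices
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    # Split a matrix into one of its four n/2 x n/2 blocks.
    def block(M, r, c):
        return [row[c:c + h] for row in M[r:r + h]]

    A11, A12 = block(A, 0, 0), block(A, 0, h)
    A21, A22 = block(A, h, 0), block(A, h, h)
    B11, B12 = block(B, 0, 0), block(B, 0, h)
    B21, B22 = block(B, h, 0), block(B, h, h)

    # Element-wise addition of two blocks: Theta(n^2) work -> f(n)
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    # 8 recursive multiplications on n/2-sized inputs -> a = 8, b = 2
    C11 = add(mat_mul(A11, B11), mat_mul(A12, B21))
    C12 = add(mat_mul(A11, B12), mat_mul(A12, B22))
    C21 = add(mat_mul(A21, B11), mat_mul(A22, B21))
    C22 = add(mat_mul(A21, B12), mat_mul(A22, B22))

    # Stitch the four result blocks back together.
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```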
So, how did they get Θ(n^3)?
He mentioned that log_2(8) = 3, which makes sense. But I just can't figure out which of the three cases this recurrence falls under, given the value of f(n).
Since Case 2 is the only one where f(n) = Θ(anything), I'm guessing that's it. But I suspect my real problem comes down to properties of logarithms or exponents.
If Case 2 needs f(n) = Θ(n^(log_2 8) * (log_2 n)^(k+1)), then how do you go from the Θ(n^3) that would require for f(n) to the Θ(n^2) we actually have? What am I missing here?
Check out the Wikipedia page on the Master Theorem.

It discusses this exact situation (a = 8, b = 2, f(n) = O(n^2)) when presenting case 1 (not case 2). Case 1 requires f(n) = O(n^(log_b(a) − ε)) for some ε > 0; here log_2(8) = 3, and n^2 = O(n^(3−ε)) with ε = 1, which gives T(n) = Θ(n^(log_2 8)) = Θ(n^3). Note that if f(n) = Θ(n^2) then f(n) = O(n^2), so case 1 can be applied.

Hope that helps.
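As a quick sanity check (my own sketch, not from the Wikipedia page), you can expand the recurrence numerically and watch T(n)/n^3 approach a constant, which is exactly what Θ(n^3) predicts:

```python
# Expand T(n) = 8*T(n/2) + n^2 with T(1) = 1 and compare it to n^3.
# The ratio T(n)/n^3 stays bounded (it approaches 2), illustrating
# that T(n) = Theta(n^3) per case 1 of the Master Theorem.

def T(n):
    if n == 1:
        return 1
    return 8 * T(n // 2) + n * n

for n in [2, 4, 8, 16, 32]:
    print(n, T(n), T(n) / n ** 3)
```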