I have two questions about analyzing time complexity using recurrence relations.
Question 1. How do you form a recurrence relation for an algorithm that uses memoization? Is this even possible?
For example, consider computing the nth term of the Fibonacci sequence. The algorithm would be:
fib(n)
    if n < 2
        return n
    else
        return fib(n-1) + fib(n-2)
The recurrence relation for this piece of code is:
T(n) = T(n-1) + T(n-2) + c, which is exponential: roughly O(2^n), and more precisely Θ(φ^n), where φ ≈ 1.618 is the golden ratio.
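To see where the exponential bounds come from, here is a standard sandwich argument (using the fact that T is non-decreasing, so T(n-1) ≥ T(n-2)):

\[
T(n) \ge 2\,T(n-2) \implies T(n) = \Omega\bigl(2^{n/2}\bigr),
\qquad
T(n) \le 2\,T(n-1) + c \implies T(n) = O(2^n).
\]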
The memoized version of this code would be:
MemFibo(n)
    if n < 2
        return n
    else
        if F[n] is undefined
            F[n] = MemFibo(n − 1) + MemFibo(n − 2)
        return F[n]
(Note: I took this from Jeff Erickson's notes, which can be found here: http://jeffe.cs.illinois.edu/teaching/algorithms/. This algorithm is in Chapter 5, Dynamic Programming.)
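For concreteness, here is a runnable Python sketch of both versions (the dict-based memo table F and the snake_case names are my own translation of the pseudocode, not Erickson's code):

# Naive version: T(n) = T(n-1) + T(n-2) + c, exponential time.
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Memoized version, mirroring MemFibo: F caches each computed value,
# so the recursive case runs at most once per index.
F = {}
def mem_fibo(n):
    if n < 2:
        return n
    if n not in F:  # "if F[n] is undefined"
        F[n] = mem_fibo(n - 1) + mem_fibo(n - 2)
    return F[n]

print(fib(10), mem_fibo(10))  # both print 55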
The complexity of this is O(n). I understand intuitively why it is O(n): we store computed values in the table F and answer repeated calls with a lookup. However, I do not know how to prove this mathematically.
Right now it looks to me as if the recurrence relation is the same as in the first example (T(n) = T(n-1) + T(n-2) + c), which is obviously wrong. How would you write the recurrence relation for this? Is it even possible to write one, given that the recursive calls are not made every time? If it is not possible, how would you formally prove that the time complexity is O(n)?
Question 2. How do you analyze the complexity of algorithms whose recurrence has the following form, where x is a constant?
T(n) = aT(n/b) + cT(n/d) + x
For the first question, the original recurrence no longer holds, because T(n) now takes one of two values depending on whether MemFibo(n) has already been called: O(n) on the first call, O(1) (a table lookup) on every subsequent call.
One way to write the recurrence is to distinguish the first call, Tf, from subsequent calls, Ts. When MemFibo(n-1) returns from the first call, it has already stored F[n-2], so the sibling call MemFibo(n-2) is a mere lookup:
Tf(n) = Tf(n-1) + Ts(n-2) + c = Tf(n-1) + O(1) + c = O(n).
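Unrolling this recurrence makes the bound explicit (with c' absorbing both constants):

\[
T_f(n) = T_f(n-1) + c' = T_f(n-2) + 2c' = \cdots = T_f(1) + (n-1)\,c' = O(n).
\]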
For the second question, it is possible to extend the master theorem to recurrences whose subproblems have different sizes; the general tool for this is the Akra-Bazzi method.
In your specific example, with x = O(1) and a + c > 1:
T(n) = Θ(n^γ), where γ > 0 is the unique solution of a·(1/b)^γ + c·(1/d)^γ = 1.
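The exponent γ rarely has a closed form, but it is easy to find numerically, since the left-hand side is strictly decreasing in γ. A minimal Python sketch (the instance a=1, b=2, c=1, d=4 is my own example, chosen because its exact answer is γ = log2((1+√5)/2)):

import math

def gamma(a, b, c, d, tol=1e-12):
    """Solve a*(1/b)**g + c*(1/d)**g = 1 for g by bisection.
    Assumes b, d > 1 and a + c > 1, so f is strictly decreasing
    with f(0) > 0 and has a unique positive root."""
    f = lambda g: a * b ** (-g) + c * d ** (-g) - 1
    lo, hi = 0.0, 1.0
    while f(hi) > 0:  # grow the bracket until the sign flips
        hi *= 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# T(n) = T(n/2) + T(n/4) + O(1)  =>  T(n) = Theta(n^gamma)
print(gamma(1, 2, 1, 4))                  # ≈ 0.6942
print(math.log2((1 + math.sqrt(5)) / 2))  # same value, in closed form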