Right now I'm learning about average-case analysis and amortized analysis, and I began to wonder when exactly you would use average-case analysis and when you would use amortized analysis. Also, do they both accomplish the same thing?
Average-case analysis is distinct from amortized analysis. The difference is subtle.
Average-case analysis studies the expected runtime when a single problem instance of a given size is drawn at random from the set of all problem instances of that size, according to some specific probability distribution over those instances.
Amortized analysis studies the aggregate runtime when solving a sequence of problem instances in a specific order.
Consider the following algorithm:
dict = {}

Foo(bar[1...n])
    if dict[bar] exists, return dict[bar]
    if n is even then
        dict[bar] = bar[1] if n > 0, or else 0 if n = 0
    else if n is odd then
        dict[bar] = Foo(bar[1...(n-1)/2])
    return dict[bar]
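For concreteness, here is a minimal Python sketch of the same procedure. It is a translation I'm supplying, not part of the original: the names foo, cache, and calls are mine, and the cache is keyed on a tuple of the array's contents since Python lists aren't hashable. The calls counter just records how many invocations happen, which is the quantity both analyses below reason about.

cache = {}   # plays the role of dict above: maps tuple(bar) -> cached result
calls = 0    # counts every invocation of foo, for the analyses below

def foo(bar):
    global calls
    calls += 1
    key = tuple(bar)          # lists aren't hashable, so key on a tuple
    if key in cache:          # "if dict[bar] exists, return dict[bar]"
        return cache[key]
    n = len(bar)
    if n % 2 == 0:            # even length: bar[1] in 1-indexed pseudocode is bar[0] here
        cache[key] = bar[0] if n > 0 else 0
    else:                     # odd length: recurse on the first (n-1)/2 elements
        cache[key] = foo(bar[:(n - 1) // 2])
    return cache[key]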
Average-case analysis: suppose all lists of length n are equally likely (either assume that the values in the array are taken from a finite set, or that different arrays can be put into finitely many equivalence classes based not on absolute sizes but on relative ones). The average-case runtime for inputs of size n is then O(1) when n is even and O(log(n)) when n is odd (in the worst case, n = 1, 3, 7, 15, 31, ..., and you have to subtract one and halve until you get all the way down to 0). A small illustration follows.
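To see where the log(n) comes from when the cache starts out empty, here is a small, self-contained helper (halving_steps is a name I'm introducing, not from the original) that counts how many calls the recursion makes on a length-n input: even lengths stop immediately, while odd lengths keep subtracting one and halving.

def halving_steps(n):
    # Count the calls made on a length-n input with an empty cache:
    # one for the input itself, plus one per "subtract one and halve" step.
    steps = 1
    while n % 2 == 1:
        n = (n - 1) // 2
        steps += 1
    return steps

print(halving_steps(6))    # even length: 1 call
print(halving_steps(5))    # 101 in binary: 2 calls
print(halving_steps(31))   # 11111 in binary: 6 calls = log2(n+1) + 1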
Amortized analysis: suppose you want to run this algorithm on arrays containing only the number 1, of every length from 0 all the way up to k. Running for n = 0 takes constant time. Running for n = 1 takes just one recursive step, because we cached the result for n = 0. Continuing on, we find that every execution takes just one recursive step, because our sequence of executions has already cached the result needed by each recursive call. Therefore, performing all k+1 executions costs O(k) work in total, which means each individual execution has an amortized complexity of O(1).
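Assuming the foo/cache/calls sketch from above is in scope, a short loop illustrates the amortized claim: run every length from 0 up to k on an all-ones array with a single shared cache and count the total work.

k = 1000
calls = 0
cache.clear()
for n in range(k + 1):
    foo([1] * n)

# Each odd-length run makes exactly one recursive call, and that call hits the
# cache immediately, so no run costs more than 2 calls: O(k) work in total,
# i.e. O(1) amortized per run.
print(calls, "calls across", k + 1, "runs")   # grows linearly with k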