I'm trying to find the amortized cost per operation in a sequence of n operations on a data structure where the ith operation costs i if i is an exact power of 3, and 1 otherwise.

Can anyone explain how to solve this problem? I found O(6), but I don't know whether that is correct.
For a given n, there are floor(log_3(n)) + 1 'expensive' operations (those at indices i = 3^k <= n, each costing i; note that 3^0 = 1 counts too); the rest cost O(1) each. The amortized cost per operation is the total cost divided by n:
O(1/n * ( (n - floor(log_3(n)) - 1) + sum_k=0..floor(log_3(n)) { 3^k } ))
= O(1/n * (n + (3^(floor(log_3(n))+1) - 1) / (3 - 1) )) // applying the formula for a finite geometric series
= O(1/n * (n + c * n)) // since 3^(floor(log_3(n))+1) <= 3n
= O(1)
Note that Big-O notation abstracts away constant factors, so O(6) does not make much sense as an answer: O(6) denotes the same class as O(1).
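To see the bound concretely, here is a small sketch (names like `total_cost` are my own, not from the question) that sums the costs directly and prints the average cost per operation, which stays bounded by a small constant as n grows:

```python
def total_cost(n):
    """Total cost of n operations: the ith costs i if i is an
    exact power of 3, and 1 otherwise."""
    cost = 0
    for i in range(1, n + 1):
        # find the smallest power of 3 that is >= i
        p = 1
        while p < i:
            p *= 3
        cost += i if p == i else 1
    return cost

for n in (10, 100, 1000, 10000):
    print(n, total_cost(n) / n)
```

The average peaks just after n hits a power of 3 (where it approaches 1 + 3/2 = 2.5) and then decays, matching the O(1) amortized bound derived above.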