Tags: algorithm, amortized-analysis

What happens to amortized analysis in binary counter if flipping a bit at index k costs now 2^k instead of 1?


Suppose that flipping bit #i costs 2^i; so, flipping bit #0 costs 1, flipping bit #1 costs 2, flipping bit #2 costs 4, and so on.

What is the amortized cost of a call to Increment(), if we call it n times?

I think the cost for n Increments should be n·2^0/2^0 + n·2^1/2^1 + n·2^2/2^2 + … + n·2^(n−1)/2^(n−1) = n·n = O(n^2). So each Increment should be O(n), right? But the assignment requires me to prove that it's O(log n). What have I done wrong here?
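For what it's worth, the O(log n) bound is easy to check empirically. Here is a quick simulation (just a sketch: the counter is a Python list of bits, LSB first, and the cost model is the one above, 2^i per flip of bit #i):

```python
import math

def increment(bits):
    """Increment an unbounded binary counter in place.
    bits[i] is bit #i (LSB first); flipping bit i costs 2**i."""
    cost, i = 0, 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0           # carry: flip 1 -> 0, cost 2^i
        cost += 2 ** i
        i += 1
    if i == len(bits):
        bits.append(0)        # grow the counter when the carry overflows
    bits[i] = 1               # flip 0 -> 1, cost 2^i
    cost += 2 ** i
    return cost

for n in (2 ** 8, 2 ** 12, 2 ** 16):
    bits = []
    total = sum(increment(bits) for _ in range(n))
    print(n, total / n, math.log2(n))   # total/n comes out to log2(n) + 1
```

The measured average cost per Increment grows like log2(n), not like n, which matches what the assignment asks me to prove.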


Solution

  • Suppose we have a p-bit integer and we walk through all 2^p of its possible values (here p = 3):

    0 0 0
    0 0 1
    0 1 0 
    0 1 1
    1 0 0 
    1 0 1 
    1 1 0
    1 1 1
    

    The rightmost bit is flipped 2^p times (on every step) at cost 1, so its overall cost is 2^p.
    The second bit is flipped 2^(p-1) times, but at cost 2, so its overall cost is again 2^p.
    ...
    The most significant bit is flipped twice (counting the wrap back to zero), at cost 2^(p-1), so its overall cost is also 2^p.

    Summing the costs over all p bits, the full cost of the 2^p operations is p*2^p.

    The per-operation (amortized) cost is therefore p*2^p / 2^p = p.

    But note that this is expressed in the number of bits p; we must express it in terms of N, the number of operations:

    N = 2^p 
     so 
    p = log(N)
    

    Finally, the amortized complexity per operation is O(log(N)).
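The bit-by-bit accounting above can also be verified directly: for a p-bit counter that wraps around, 2^p increments cost exactly p*2^p in total. A minimal sketch (assuming a list-of-bits representation, LSB first):

```python
def increment(bits, p):
    """One step of a p-bit wrap-around counter; flipping bit i costs 2**i."""
    cost, i = 0, 0
    while i < p and bits[i] == 1:
        bits[i] = 0           # carry: flip 1 -> 0, cost 2^i
        cost += 2 ** i
        i += 1
    if i < p:                 # all-ones simply wraps to all-zeros
        bits[i] = 1           # flip 0 -> 1, cost 2^i
        cost += 2 ** i
    return cost

p = 10
bits = [0] * p
total = sum(increment(bits, p) for _ in range(2 ** p))
print(total, p * 2 ** p)      # both are 10240: the full cost is p * 2^p
```

After 2^p increments the counter is back to all zeros, and the total cost equals p*2^p exactly, confirming the amortized cost of p = log(N) per operation.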