Tags: algorithm, logarithm, asymptotic-complexity

Adding a log in asymptotic analysis


Have a problem I'm trying to work through and would very much appreciate some assistance! What's the time complexity of...

for (int j = 1; j <= n; j++) {
    int k = j;
    while (k < n) {
        sum += a[k] * b[k];
        k += log(n);   // advance k by log n each inner iteration
    }
}

The outer for loop runs n times. I'm not sure how to deal with k += log(n) in the inner loop. My thought is that it's O(n^2): stepping k by log(n) means the inner loop doesn't run a full n times, but I don't think the savings is enough to bring the total down to O(n*log n). Obviously, that's just a guess, and any help in figuring out how to show it mathematically would be greatly appreciated!


Solution

  • You can treat log(n) as a constant here, sort of.

    Each iteration of the outer loop performs a constant amount of work (sum += ...; k += ...) roughly n/log(n) times, since k advances from j up to n in steps of log(n). There are n iterations of the outer loop. The total work is thus O(n^2/log(n)); the sum below makes this precise.
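
    To spell that out as a sum (a sketch, assuming the step is a fixed value of about log n): for a given j, the inner loop runs roughly (n - j)/log(n) times, and summing over j gives

        \sum_{j=1}^{n} \frac{n-j}{\log n} = \frac{1}{\log n}\sum_{i=0}^{n-1} i = \frac{n(n-1)}{2\log n} = \Theta\!\left(\frac{n^2}{\log n}\right)

    which matches the n^2/log(n) bound above.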

    Any time you see a bunch of operations like so:

     ---------------------b-------------------------
     |O(blah) + O(blah) + O(blah) + O(blah) + O(blah)
     |O(blah) + O(blah) + O(blah) + O(blah)     .
    a|O(blah) + O(blah) + O(blah)               .
     |O(blah) + O(blah)                         .
     |O(blah)     .         .         .         .
    

    It is a*b * O(blah) -- just imagine completing the rectangle (where I put the .s). The triangle of work is a constant fraction of that a-by-b rectangle (half of it), hence the O(a*b).

    In the above case, b = n, a = n/log(n), and O(blah) = O(1) (the constant work inside the inner loop), which gives O(n * n/log(n)) = O(n^2/log(n)) total.
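
    As a quick empirical check (a hypothetical sketch, not from the original answer; the class name LogStepCount and the choice of Math.log are mine), the following counts the inner-loop iterations and compares them against n^2/log(n):

        // Counts inner-loop iterations of the loop from the question and
        // compares the count against n^2 / log(n).
        public class LogStepCount {

            // Number of times the inner loop body runs for a given n.
            static long countIterations(int n) {
                long count = 0;
                // Treat log n as a fixed integer step (at least 1, to avoid an infinite loop).
                int step = Math.max(1, (int) Math.log(n));
                for (int j = 1; j <= n; j++) {
                    int k = j;
                    while (k < n) {
                        count++;   // stands in for sum += a[k] * b[k]
                        k += step;
                    }
                }
                return count;
            }

            public static void main(String[] args) {
                for (int n = 1 << 10; n <= 1 << 16; n <<= 2) {
                    long actual = countIterations(n);
                    double predicted = (double) n * n / Math.log(n);
                    // The ratio should stay roughly constant (around 0.5, since the
                    // triangle is about half the rectangle), consistent with Theta(n^2/log n).
                    System.out.printf("n=%d actual=%d n^2/log(n)=%.0f ratio=%.3f%n",
                            n, actual, predicted, actual / predicted);
                }
            }
        }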