I have been given the following pseudocode:
j = 1
while j < n:
    k = 2
    while k < n:
        k = k*k
    j++
My thinking is that this piece of pseudocode has complexity O(n·log(n)): the outer loop executes n times, while the inner loop essentially splits the increment step in half each time. Is my thinking too far off?
Edit: one more (this isn't homework, I promise, just examples to help me understand):
for i = 1 to n:
    for j = 1 to n:
        k = j*j
        while k < n:
            k++
In this instance, the outermost loop will execute n times. The middle loop will also execute n times, putting us at n² times. The innermost loop, as I understand it, will execute log(n) times, putting us at O(n²·log(n)). Is my understanding correct?
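For reference, one way to sanity-check a complexity guess like this is to translate the loops to Python and count the innermost iterations empirically for growing n (a quick sketch; `count_innermost` is just an illustrative name):

```python
def count_innermost(n):
    """Count iterations of the innermost while loop of the second snippet."""
    count = 0
    for i in range(1, n + 1):        # for i = 1 to n
        for j in range(1, n + 1):    # for j = 1 to n
            k = j * j
            while k < n:             # innermost loop: k++ until k >= n
                k += 1
                count += 1
    return count

print(count_innermost(10))
```

Comparing the measured counts against candidate formulas such as n²·log(n) for several values of n shows quickly which guess fits.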
It's O(n log log n).

The outer loop just repeats the inner loop n times as far as time is concerned, so it contributes a multiplier of n.
The inner loop is trickier: it does repeated squaring of k. See how it goes:

2^1 -> 2^2 -> 2^4 -> 2^8 -> 2^16 -> 2^32 -> ...
So, for example, if n = 2^32, the loop will have 5 iterations. Here, log_2(n) is 32, and log_2(32) is 5.
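The squaring sequence above is easy to verify directly (a minimal sketch of just the inner loop; `inner_iterations` is a name I'm introducing):

```python
def inner_iterations(n):
    """Simulate the inner loop: k = 2; while k < n: k = k*k. Return iteration count."""
    k, count = 2, 0
    while k < n:
        k = k * k
        count += 1
    return count

print(inner_iterations(2 ** 32))  # -> 5
```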
Generally, if n = 2^(2^r), the inner loop will arrive at n after r iterations. By taking the logarithm, we arrive at log n = 2^r. By taking the logarithm another time, we have log log n = r.
As you probably know, the base of the logarithm is not important when dealing with asymptotic behavior, as long as it is constant.
So, we have n iterations of a loop which itself makes log log n iterations, making the overall complexity O(n log log n).
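To tie it together, here is a sketch comparing the measured inner-loop count with log2(log2(n)) for n of the special form 2^(2^r); the outer loop only multiplies the total by roughly n, so it doesn't change the comparison:

```python
import math

def inner_iterations(n):
    """Count squarings in: k = 2; while k < n: k = k*k."""
    k, count = 2, 0
    while k < n:
        k = k * k
        count += 1
    return count

for r in range(2, 6):
    n = 2 ** (2 ** r)                     # n = 2^(2^r)
    measured = inner_iterations(n)
    predicted = math.log2(math.log2(n))   # should equal r exactly for these n
    print(n, measured, predicted)
```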