I don't understand how the runtime of the algorithm can be log(log(n)). Can someone help me?
s = 1
while s <= (log n)^2 do
    s = 3s
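To make the loop concrete, here is a runnable Python sketch of the pseudocode above that also counts how many times the body executes. The pseudocode doesn't fix the log base; this assumes base 2, matching the notation note in the answer below:

```python
import math

def count_iterations(n):
    """Run the loop from the question and count its iterations.
    Assumes log(n) means log base 2 of n."""
    s = 1
    iterations = 0
    while s <= math.log2(n) ** 2:
        s = 3 * s
        iterations += 1
    return iterations

# Even for a huge n, the loop body runs only a handful of times:
print(count_iterations(2 ** 64))  # prints 8
```

Note how slowly the iteration count grows: doubling the *number of bits* of n barely changes it, which is the hallmark of O(log(log(n))) behavior.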
Notation note: log(n) indicates log2(n) throughout the solution.
Well, I suppose (log n)^2 indicates the square of log(n), which means log(n) * log(n). Let us try to analyze the algorithm.
It starts from s = 1 and goes like 1, 3, 9, 27, ... Since s is multiplied by 3 in every iteration, after each iteration s can be written as 3^m, m being the number of iterations, starting from 1.
We will keep doing these iterations until s becomes bigger than log(n) * log(n). So at the point where the loop stops, 3^m is (roughly) equal to log(n) * log(n).
Solve the equation:

3^m = log(n) * log(n)
m = log3(log(n) * log(n))
The time complexity of the algorithm can be shown as O(m). We have to express m in terms of n.
log3(log(n) * log(n)) = log3(log(n)) + log3(log(n)) = 2 * log3(log(n))
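The identity above is just the product rule for logarithms; it can be sanity-checked numerically (n = 10^9 here is an arbitrary sample value):

```python
import math

n = 10 ** 9
lg = math.log2(n)

lhs = math.log(lg * lg, 3)   # log3(log(n) * log(n))
rhs = 2 * math.log(lg, 3)    # 2 * log3(log(n))

# The two sides agree up to floating-point rounding error.
print(abs(lhs - rhs) < 1e-9)  # prints True
```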
For Big-Oh notation, constant factors do not matter, so let us get rid of the 2.

Time complexity = O(log3(log(n)))
Well ok, here is the deal: by the definition of Big-Oh notation, it represents an upper bound on the runtime of our function. Therefore O(n) ⊆ O(n^2).
Notice that log3(a) < log2(a) for every a > 1, because log3(a) = log2(a) / log2(3) and log2(3) > 1. By the same logic we can conclude that O(log3(log(n))) ⊆ O(log(log(n))).
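In fact the containment goes both ways: log3 and log2 differ only by the constant factor 1/log2(3), so the two classes are equal. A quick check illustrates this (sample values of a chosen arbitrarily):

```python
import math

# log3(a) / log2(a) is the same constant for every a > 1,
# namely 1 / log2(3) ~= 0.63093. A constant factor vanishes
# inside Big-Oh, so O(log3(log n)) = O(log2(log n)).
for a in (10, 1000, 10 ** 6):
    ratio = math.log(a, 3) / math.log2(a)
    print(round(ratio, 5))  # prints 0.63093 each time
```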
So the time complexity of the algorithm is O(log(log n)). Not the most scientific explanation, but I hope you got the point.