I'm currently taking a class in algorithms. The following is a question I got wrong on a quiz: basically, we have to indicate the worst-case running time in Big O notation:
int foo(int n)
{
    int m = 0;
    while (n >= 2)
    {
        n = n / 4;
        m = m + 1;
    }
    return m;
}
I don't understand why the worst-case time for this isn't just O(n). Would appreciate an explanation. Thanks.
foo calculates log4(n) by repeatedly dividing n by 4, using m as a counter for the number of divisions. When the loop finishes, m is (roughly) the number of times 4 divides into n, i.e. about log base 4 of n. For example, with n = 64 the loop runs three times (64 → 16 → 4 → 1), and log4(64) = 3. The running time is linear in the final value of m, which equals log base 4 of n, so the algorithm is O(log n). That is also O(n), since log n grows more slowly than n, but O(log n) is the tighter bound the quiz is looking for.
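If you want to convince yourself empirically, here is a minimal sketch in plain C (the main harness and the sample inputs are just my own illustration, not part of the quiz) that runs the loop and compares the iteration count against log base 4 of n:

#include <stdio.h>
#include <math.h>

/* Same loop as in the quiz: counts how many times n can be divided by 4. */
int foo(int n)
{
    int m = 0;
    while (n >= 2)
    {
        n = n / 4;
        m = m + 1;
    }
    return m;
}

int main(void)
{
    /* Compare the iteration count with log base 4 of n for a few inputs. */
    int inputs[] = {4, 16, 64, 1024, 1000000};
    for (int i = 0; i < 5; i++)
    {
        int n = inputs[i];
        printf("n = %7d  foo(n) = %2d  log4(n) = %.2f\n",
               n, foo(n), log(n) / log(4.0));
    }
    return 0;
}

The printed counts should track log4(n) closely (up to rounding at the boundaries), which is exactly the O(log n) growth described above: doubling n only adds a constant fraction of an iteration, while going from n to 4n adds just one.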