I have tried to understand Big O notation for worst-case runtime, but I still don't quite understand it.
This is some code that I wrote recently:
def g(n):
    if n == 0:
        return 1
    elif n == 1:
        return 2
    else:
        n_div = n // 2
        n_mod = n % 2
        return g(n_div) * g(n_mod)
So I hope that I'm at least right that:
def g(n):
    if n == 0:
        return 1
and:
    elif n == 1:
        return 2
are O(1), so constant.
But what about the else part? Is it O(n) because it depends on the n that I choose? Can anyone explain what the Big O complexity of the else part is?
Well, you've noticed that you can separate your function into 3 individual cases, and you have already identified that the first 2 are O(1). The third is slightly trickier, but you can separate that into two parts as well.
The recursion clearly occurs from:
g(n//2)*g(n%2)
We can immediately see that n % 2 will evaluate to either 0 or 1, which resolves one of the first 2 cases again, so we can disregard that. That leaves us with g(n // 2). By rewriting this as a printing loop and examining the output, you'll notice something:
>>> n = 37
>>> while n > 1:
... n //= 2
... print(n)
...
18
9
4
2
1
As you can see, the terms decrease by half each time, and the same occurs in the recursion. This is logarithmic.
As a result, the worst case for this function is O(log n).
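To see this concretely, here is a sketch (the helper name and counter are my own additions, not part of the original code) that instruments the function to count how many times it recurses on n // 2:

```python
def g_counted(n, counter=None):
    # Same logic as g(n), but tracks how many times we recurse on n // 2.
    if counter is None:
        counter = {"halvings": 0}
    if n == 0:
        return 1, counter
    elif n == 1:
        return 2, counter
    else:
        counter["halvings"] += 1
        half, _ = g_counted(n // 2, counter)
        rem, _ = g_counted(n % 2, counter)  # always 0 or 1: hits a base case, O(1)
        return half * rem, counter

# The number of halvings grows like log2(n):
for n in (8, 64, 1024):
    _, c = g_counted(n)
    print(n, c["halvings"])  # prints 8 3, then 64 6, then 1024 10
```

The count matches log2(n), which is why the whole function is O(log n): the g(n % 2) branch always resolves to a base case in constant time, so only the halving chain contributes to the depth.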
You can learn more about what log n actually means in Big-O notation by looking at this question.