algorithm · time-complexity · big-o · complexity-theory

Why does O(log(n)) come out equal to O(log(n!))?


While analyzing the complexity of some code, I found it to be O(log(n!)). I know that this can be proven equal to O(n*log(n)). However, can someone tell me where the following proof goes wrong?

Theorems used:

  • log(ab) = log(a) + log(b)
  • O(a+b) = O(max(a,b))

Proof

O(log(n!)) = O(log(n*(n-1)*(n-2)*...*2*1))

           = O(log(n) + log(n-1) + ... )

           = O(max(log(n), log(n-1), ...))

           = O(log(n))

Can someone tell me where I'm going wrong?


Solution

  • You cannot say O(log(n) + log(n-1) + ... ) = O(max(log(n), log(n-1), ...)).

    This identity only holds for a constant number of summands. In your case, the number of summands depends on n.

    Otherwise you could also "prove"

    O(n) = O(1 + 1 + 1 + ... + 1) = O(max(1, 1, ..., 1)) = O(1)
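
A quick numerical sketch can back this up. The snippet below (in Python, using `math.lgamma(n + 1)` to compute log(n!) without overflow) compares log(n!) against both n*log(n) and log(n); the ratios suggest that log(n!) tracks n*log(n), not log(n):

```python
import math

# Compare log(n!) against n*log(n) and log(n) for growing n.
# math.lgamma(n + 1) returns ln(n!) without computing the huge factorial.
for n in [10, 100, 1000, 10000]:
    log_fact = math.lgamma(n + 1)   # ln(n!)
    n_log_n = n * math.log(n)       # n * ln(n)
    log_n = math.log(n)             # ln(n)
    print(f"n={n:>6}  log(n!)/(n*log(n)) = {log_fact / n_log_n:.4f}"
          f"  log(n!)/log(n) = {log_fact / log_n:.1f}")

# The first ratio approaches 1 (consistent with Stirling's approximation,
# ln(n!) ~ n*ln(n) - n), while the second grows without bound --
# so log(n!) is Theta(n*log(n)), not Theta(log(n)).
```

If log(n!) really were O(log(n)), the second ratio would stay bounded; instead it keeps growing, which is exactly the effect of having n summands rather than a constant number.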