I was given a question in my algorithms class:
Is 2^{2n} upper bounded by O(3^n)?
Now clearly 2^{2n} is 4^n, and 4^n can't be upper bounded by 3^n.
However, if I take the log of both sides,
on the LHS I get 2n,
on the RHS I get kn (where k is some constant).
2n can be upper bounded by a constant multiple of kn, so this contradicts the more obvious claim above. What am I doing wrong in this reasoning?
Essentially, your reasoning boils down to this statement:
If log f(n) ≤ c log g(n) for some constant c, then f(n) = O(g(n)).
This statement isn't true in general, though. One way to see this is to find some counterexamples:
If f(n) = n^4 and g(n) = n, then log f(n) = 4 log n and log g(n) = log n. It's true that there's a constant multiple of log n that's bigger than 4 log n, but n^4 ≠ O(n).
If f(n) = 4^n and g(n) = 2^n, then (taking logs base 2) log f(n) = 2n and log g(n) = n. There's a constant multiple of n that makes it bigger than 2n, but 4^n ≠ O(2^n).
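The second counterexample is easy to check numerically. Here's a small sketch of my own (not part of the original argument): the ratio of the logs is the constant 2 for every n, yet the ratio of the functions themselves grows without bound, so no single constant c can satisfy 4^n ≤ c · 2^n.

```python
import math

for n in (5, 10, 20):
    # log(4^n) / log(2^n) = 2n / n = 2 for every n...
    log_ratio = math.log(4**n) / math.log(2**n)
    # ...but 4^n / 2^n = 2^n grows without bound,
    # so no constant c gives 4^n <= c * 2^n.
    fn_ratio = 4**n / 2**n
    print(n, log_ratio, fn_ratio)
```

The constant ratio of the logs is exactly the kind of evidence the flawed reasoning relies on, while the unbounded ratio of the functions is what big-O actually measures.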
To really get at why this doesn't work, notice that c log g(n) = log(g(n)^c), so multiplying a log by a constant is equivalent to raising the original function to some large power. You can reason about the big-O of a function after multiplying it by a constant, but not after raising it to a power, which is why this reasoning breaks down.
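Applied to the original question, the same gap shows up. This is a quick sketch of my own: the log identity c · log g(n) = log(g(n)^c) holds exactly, and the ratio 2^{2n} / 3^n = (4/3)^n grows without bound, so 2^{2n} ≠ O(3^n) even though 2n = O(n).

```python
import math

# c * log g(n) = log(g(n)^c): multiplying a log by a constant
# exponentiates the original function.
n, c = 7, 3.0
assert math.isclose(c * math.log(3**n), math.log((3**n) ** c))

# 2^{2n} = 4^n, and 4^n / 3^n = (4/3)^n grows without bound,
# so 2^{2n} is not O(3^n) even though their logs differ only
# by a constant factor.
for n in (10, 20, 40):
    print(n, 4**n / 3**n)
```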