
Is subtraction valid in big-O notation?


If we have f(n)=O(g(n)) and h(n)=O(x(n)), then is it true that f(n)-h(n)=O(g(n)-x(n))?

I do not think so, because the inequality in the definition of big-O notation gets reversed when x(n) is subtracted from g(n). That is, since f(n) <= c_1 g(n) for some constant c_1 and h(n) <= c_2 x(n) for another constant c_2, it is not clear whether we can directly subtract the two inequalities. Any hints? Thanks in advance.


Solution

  • You are correct: f(n) = O(g(n)) and h(n) = O(x(n)) do not imply f(n) - h(n) = O(g(n) - x(n)).

    For a counterexample, just take g(n) = x(n) = 1, f(n) = 2 and h(n) = 1. This clearly satisfies the assumptions, but f(n) - h(n) = 1 while g(n) - x(n) = 0, so the claim does not hold: there is no constant c with 1 <= c · 0.

    The only thing you can say about the difference is that it is O(max(|g(n)|, |x(n)|)); a short derivation is sketched at the end of this answer.

    Note, however, that the same is true for addition if you allow the functions to take negative values: with g(n) = n, x(n) = -n and f(n) = h(n) = n, the assumptions hold, yet f(n) + h(n) = 2n is not O(g(n) + x(n)) = O(0). In programming-related contexts it is generally assumed that the functions take only positive values, and in that case subtraction makes only limited sense.
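    A minimal sketch of where the max bound comes from, assuming (as in the question) that |f(n)| <= c_1 |g(n)| and |h(n)| <= c_2 |x(n)| for all sufficiently large n; absolute values are added so the argument also covers functions that may take negative values:

    % Triangle inequality, then the two big-O assumptions, then bound each
    % term by the larger of |g(n)| and |x(n)|.
    \begin{align*}
    |f(n) - h(n)| &\le |f(n)| + |h(n)|
                  && \text{triangle inequality} \\
                  &\le c_1 |g(n)| + c_2 |x(n)|
                  && \text{by the two big-O assumptions} \\
                  &\le (c_1 + c_2)\,\max\bigl(|g(n)|, |x(n)|\bigr).
    \end{align*}

    So f(n) - h(n) = O(max(|g(n)|, |x(n)|)) with witness constant c_1 + c_2, while the counterexample above shows that no bound in terms of g(n) - x(n) is possible in general.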