Tags: constants, asymptotic-complexity, notation

When do we consider the constants in asymptotic notation?


I think that ignoring the constants should have a limit!

When the constant becomes too big, we should consider it, because it makes a huge difference.

Are there any rules for that?


Solution

  • Note that when we're talking about asymptotic behaviour, we're describing limiting behaviour.

    If you go into the details of any of the asymptotic notations (Big O, Big Theta, Big Omega), you'll realize that constants, in essentially all cases of interest, have no effect on the outcome of our analysis of a function's or algorithm's asymptotic behaviour.


    As an example, let's look loosely at the definition of Big-O notation:

    f(n) is said to be in O(g(n)) if we can find positive constants k and N such that |f(n)| ≤ k · |g(n)| holds for all n > N.
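
    As an informal illustration of this definition (a sketch of my own, not a proof; the example function and the constants k = 6, N = 100 are arbitrary choices), one can spot-check the inequality numerically for candidate constants:

    def looks_bounded(f, g, k, N, samples):
        """Spot-check |f(n)| <= k*|g(n)| for sampled n > N (evidence, not a proof)."""
        return all(abs(f(n)) <= k * abs(g(n)) for n in samples if n > N)

    # Example: f(n) = 5n + 100 appears to be in O(n) with k = 6, N = 100,
    # since 5n + 100 <= 6n whenever n >= 100.
    print(looks_bounded(lambda n: 5 * n + 100, lambda n: n, k=6, N=100,
                        samples=range(101, 10_000)))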

    Now, let's say f(n) is linear

    f(n) = n + C
    

    and that C is a hideously huge constant. Even in this case, the constant is not of interest for the asymptotic (limiting) behaviour of f(n). From the definition above, we can simply choose our own constant k large enough that the following holds

    k·n > C, for all n > N
    

    That is, in the process of our asymptotic analysis of f(n), we may choose k freely, say even equal to C. This yields (assuming N > 1, hence n > 1):

    f(n) = n + C
         < { n > 1, C > 0 }   n + n·C
         = { choose k = C }   n + n·k
         < { k > 1 }          2k·n
         = { set l = 2k }     l·n
         = { g(n) = n }       l·g(n)
    

    With this, we've shown that f(n) is in O(n), even if the constant term C in f(n) is "hideously huge".
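
    As a quick numeric sanity check (my own sketch, not part of the original argument, and not a proof), the following Python snippet verifies the derived bound n + C ≤ 2·C·n for a sample of n > 1, with a deliberately huge, arbitrarily chosen C:

    # Numeric sanity check of the derivation above: f(n) = n + C <= l*n
    # with l = 2k = 2C, for sampled n > 1, even for a huge constant C.
    C = 10**12                                # hypothetical "hideously huge" constant
    l = 2 * C                                 # l = 2k, with k chosen equal to C

    for n in range(2, 1_000_001, 99_999):     # arbitrary sample of n > 1
        assert n + C <= l * n, (n, n + C, l * n)

    print("n + C <= 2*C*n held for every sampled n > 1")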

    The same holds for any kind of function or algorithm you can cook up: constant terms have no effect on the asymptotic behaviour, no matter how large they are (we study limiting behaviour, not state!). Not even for a constant function, say h(n) = C, does the size of C have any effect on the result of our asymptotic analysis (Big-O here); we can just choose k freely and readily derive that h(n) is in O(1), i.e., constant time: no increase in computational time as the problem size grows, which is the case for e.g. hash table lookups.
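
    To make the h(n) = C and hash-table point concrete, here is a rough, machine-dependent sketch (my own illustration; the dictionary sizes and timeit parameters are arbitrary). It times a single dict lookup in a small and in a large dictionary; the point is only that the lookup cost does not grow with the number of stored keys:

    import timeit

    # Rough illustration of O(1) lookups: the cost of one dict lookup should not
    # grow with the size of the dictionary (absolute timings are machine-dependent).
    for size in (1_000, 1_000_000):                      # arbitrary sizes
        table = {i: i for i in range(size)}
        t = timeit.timeit("table[size // 2]",            # look up one existing key
                          globals={"table": table, "size": size},
                          number=1_000_000)
        print(f"size={size:>9}: {t:.3f} s for 1,000,000 lookups")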


    Now, you could, of course, look at the importance of large constant terms in functions as well as algorithms in a general context, but this wouldn't, based on the above, have anything in particular to do with asymptotic analysis.

    For an example of how increasing a constant in an asymptotic (Big-O) analysis of a polynomial has no effect whatsoever on the result of the analysis, see: