Tags: time-complexity, asymptotic-complexity

Asymptotic complexity for typical expressions


The increasing order of the following functions (shown in the picture below), in terms of asymptotic complexity, is:

(A) f1(n); f4(n); f2(n); f3(n)

(B) f1(n); f2(n); f3(n); f4(n);

(C) f2(n); f1(n); f4(n); f3(n)

(D) f1(n); f2(n); f4(n); f3(n)

[image: definitions of f1(n), f2(n), f3(n) and f4(n) referenced by the question]

a) The asymptotic order for this (supposedly easy) question was given as (n^0.99)*log(n) < n. How? log might be a slowly growing function, but it still grows faster than a constant.

b) Consider function f1: suppose instead it were f1(n) = (n^1.0001)*log(n). What would the answer be then?

Whenever an expression involves a product of a logarithmic and a polynomial factor, does the logarithmic factor outweigh the polynomial one?

c) How do we check such cases? For example:

1) (n^2)*log(n) vs (n^1.5): which has higher time complexity?
2) (n^1.5)*log(n) vs (n^2): which has higher time complexity?


Solution

  • If we consider C_1 and C_2 such that C_1 < C_2, then we can say the following with certainty

    (n^C_2)*log(n) grows faster than (n^C_1)
    

    This is because (n^C_1) grows slower than (n^C_2) (obviously)

    also, for values of n larger than 2 (for log in base 2), log(n) is greater than 1.
    
    in fact, log(n) is asymptotically greater than any constant C,
    because log(n) -> inf as n -> inf
    
    if (n^C_2) is asymptotically greater than (n^C_1) AND log(n) is asymptotically 
    greater than 1, then we can certainly say that 
    (n^2)*log(n) has greater complexity than (n^1.5)
    

    We think of log(n) as a "slowly growing" function, but it still grows without bound while 1 stays constant, which is the key here.
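
    As a quick numeric sanity check (not a proof, just an illustration in Python), the two
    ratios from question c) behave as claimed: (n^2 * log n) / n^1.5 keeps growing, while
    (n^1.5 * log n) / n^2 shrinks toward 0.

    import math

    # Ratios for question c): the first should grow without bound,
    # the second should shrink toward 0 as n increases.
    for n in [10, 10**3, 10**6, 10**9]:
        grows   = (n**2   * math.log(n)) / n**1.5   # (n^2 * log n) / n^1.5
        shrinks = (n**1.5 * math.log(n)) / n**2     # (n^1.5 * log n) / n^2
        print(n, grows, shrinks)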


    coder101 asked an interesting question in the comments, essentially,

    is n^e = Ω((n^c)*log_d(n))?
    where e = c + ϵ for arbitrarily small ϵ
    

    Let's do some algebra.

    n^e = (n^c)*(n^ϵ)
    so the question boils down to
    is n^ϵ = Ω(log_d(n))
    or is it the other way around, namely:
    is log_d(n) = Ω(n^ϵ)
    

    In order to do this, let us find the value of ϵ that satisfies n^ϵ > log_d(n).

    n^ϵ > log_d(n)
    ϵ*ln(n) > ln(log_d(n))
    ϵ > ln(log_d(n)) / ln(n)
    

    Because we know for a fact that, for any constant c > 0,

    c * ln(n) > ln(ln(n))        (1)
    for all sufficiently large n
    
    we can say that, for an arbitrarily small ϵ, there exists an n large enough to 
    satisfy ϵ > ln(log_d(n)) / ln(n),
    
    because ln(log_d(n)) = ln(ln(n)) - ln(ln(d)) differs from ln(ln(n)) only by a
    constant, so by (1), ln(log_d(n)) / ln(n) ---> 0 as n -> infinity.
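
    A small Python sketch (illustration only, assuming base d = 2 for the log) shows
    that ratio heading to 0:

    import math

    # ln(log_2(n)) / ln(n) shrinks toward 0 as n grows, which is what lets an
    # arbitrarily small epsilon eventually exceed it.
    for n in [10, 10**3, 10**6, 10**12, 10**24]:
        print(n, math.log(math.log(n, 2)) / math.log(n))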
    

    With this knowledge, we can say that

    n^ϵ = Ω(log_d(n))
    for arbitrarily small ϵ
    
    which means that
    n^(c + ϵ) = Ω((n^c)*log_d(n))
    for arbitrarily small ϵ.
    

    In layperson's terms:

    n^1.1 > n * ln(n)
    once n is large enough
    
    also
    
    n^1.001 > n * ln(n)
    once n is much, much bigger
    
    and even
    n^1.0000000000000001 > n * ln(n)
    once n is very, very big.
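
    To get a feel for where those crossover points sit, here is a rough Python search
    (illustration only). It works with x = ln(n) so nothing overflows, using the fact
    that n^e > n*ln(n) is equivalent to (e - 1)*x > ln(x):

    import math

    def crossover_ln(e, x=2.0):
        # grow x = ln(n) until (e - 1)*x exceeds ln(x),
        # i.e. until n**e overtakes n*ln(n)
        while (e - 1) * x <= math.log(x):
            x *= 1.001
        return x

    for e in [1.1, 1.01, 1.001]:
        x = crossover_ln(e)
        print(f"e = {e}: n**e overtakes n*ln(n) around n = exp({x:.0f})")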