algorithm, time-complexity, big-o, asymptotic-complexity, recurrence

How to get the time complexity of this recurrence: T(n) = sqrt(n) * T(sqrt(n)) + n


This recurrence:

T(n) = sqrt(n) * T(sqrt(n)) + n

It does not appear to be solvable with the Master theorem, and it does not appear to be solvable with Akra-Bazzi either. Even if I set n = 2^k, so that T(2^k) = 2^(k/2) * T(2^(k/2)) + 2^k, and define S(k) = T(2^k), the recurrence becomes S(k) = 2^(k/2) * S(k/2) + 2^k, where the multiplier is not constant, so the usual change of variables does not work either.

I am not sure how I would derive the closed form, or the time complexity, of this recurrence if it were given to me in an interview. What would you do?
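To at least get a feel for the growth rate, here is a rough numeric sketch (Python) of the recurrence. It assumes a base case of T(n) = 1 for n <= 2, which the problem does not specify, and treats sqrt(n) as a real number, so it is only an approximation:

    import math

    def T(n):
        # Assumed base case (not part of the original problem): T(n) = 1 for n <= 2.
        if n <= 2:
            return 1.0
        return math.sqrt(n) * T(math.sqrt(n)) + n

    for n in (10**3, 10**6, 10**12, 10**24, 10**48):
        # T(n) / n shows how much faster than linear the solution grows.
        print(n, round(T(n) / n, 3))

The ratio T(n) / n does grow, but extremely slowly (it increases by about 1 each time n is squared), so whatever the extra factor is, it is far smaller than log(n).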


Solution

  • I have not used any of the common techniques (Master theorem, Akra-Bazzi, change of variables) here; a direct manipulation of the recurrence works.

    Note that no base case is given. Let us take T(a) = b, where a > 1 and b are constants, as the base case.

    Dividing by 'n', we get: T(n) / n = T(sqrt(n)) / sqrt(n) + 1

    Use g(k) = T(k) / k

    So g(n) = g(sqrt(n)) + 1

    In other words, g(n) is essentially the number of times we can take the square root of n before we reach the base-case constant a.

    That means there is a k such that n^(1/2^k) >= a and n^(1/2^(k+1)) < a.

    Roughly, let n^(1/2^k) = a => n = a^(2^k) => lg(n) = 2^k * lg(a) => 2^k = lg(n) / lg(a) => k = lg(lg(n)) - lg(lg(a)) = lg(lg(n)) - O(1). Then g(n) = k + g(a) = k + b/a = O(log(log(n))). (The first sketch after the verification below checks this count numerically.)

    This means T(n) = n * g(n) = O(n * log(log(n))). Substituting this back into the original recurrence checks out; see the verification below and the second sketch at the end.

    Verification: if we take the constant hidden in the O() to be 1 and let T(n) = n * lg(lg(n)), where lg denotes log base 2, we get:

    RHS = sqrt(n) * (sqrt(n) * lg(lg(sqrt(n)))) + n
        = n * lg((1/2) * lg(n)) + n
        = n * (lg(lg(n)) - 1) + n
        = n * lg(lg(n)) - n + n
        = n * lg(lg(n))
        = T(n)
        = LHS
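
    As a quick numeric check of the g(n) step, here is a minimal sketch (Python) that counts how many times the square root can be taken before the value drops below an assumed base-case constant a = 2, and compares that count with lg(lg(n)); the two agree up to a small additive constant:

        import math

        def iterated_sqrt_count(n, a=2.0):
            # How many times sqrt can be applied before the value drops below a.
            count = 0
            while n >= a:
                n = math.sqrt(n)
                count += 1
            return count

        for n in (16, 256, 10**6, 10**12, 10**100):
            print(n, iterated_sqrt_count(n), round(math.log2(math.log2(n)), 2))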
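
    And a numeric check of the final bound: with an assumed base case T(n) = 1 for n <= 2, the ratio T(n) / (n * lg(lg(n))) stays bounded and roughly constant as n grows, which is consistent with T(n) = O(n * log(log(n))) and with the algebraic verification above:

        import math

        def T(n):
            # Assumed base case (not given in the problem): T(n) = 1 for n <= 2.
            if n <= 2:
                return 1.0
            r = math.sqrt(n)
            return r * T(r) + n

        for n in (10**3, 10**6, 10**12, 10**24, 10**48):
            ratio = T(n) / (n * math.log2(math.log2(n)))
            print(n, round(ratio, 3))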