Tags: algorithm, time-complexity, big-o, asymptotic-complexity

Best and worst case time for Algorithm S when time complexity changes in accordance with n being even/odd


The following is a homework assignment, so I would rather get hints or bits of information that would help me figure this out, and not complete answers.

Consider S an algorithm solution to a problem that takes as input an array A of size n. After analysis, the following conclusion was obtained:

  • Algorithm S executes an O(n)-time computation for each even number in A.

  • Algorithm S executes an O(logn)-time computation for each odd number in A.

What are the best and worst case time for algorithm S?

From this I understand that the time complexity changes depending on whether n is even or odd. In other words, if n is even, S takes O(n) time, and when n is odd, S takes O(logn) time.

Is it a simple matter of taking the best case and the worst case of both growth-rates, and choosing their boundaries? Meaning:

Best case of O(n) is O(1), and worst case is O(n).

Best case of O(logn) is O(logn) and worst case is O(logn).

Therefore the best case for Algorithm S is O(logn) and the worst case is O(n)?

Am I missing something? Or am I wrong in assessing the different best/worst cases of both big-Oh bounds?


1st attempt:

Ok, so I completely misunderstood the problem. Thanks to candu, I can now better understand what is required of me, and so try to calculate the best and worst case better.

It seems that Algorithm S changes its runtime according to EACH number in A. If the number is even, the runtime is O(n), and if the number is odd, we get O(logn).

The worst case will be composed of an array A of n even numbers, and for each the algorithm will run O(n). In other words, the worst case runtime for Algorithm S should be n*O(n).

The best case will be composed of an array A of n odd numbers, and for each the algorithm will run O(logn). The best case runtime for algorithm S should be n*O(logn).

Am I making any sense? Is it true, then, that:

Best case of Algorithm S is n*O(logn) and worst case is n*O(n)?

If that is true, can it be rewritten? For example, as O(log^n(n)) and O(n^n)? Or is that an arithmetic mistake?


2nd attempt:

Following JuanLopes' response, it seems like I can rewrite nO(n) as O(n*n) or O(n^2), and nO(logn) as O(nlogn).

Does it make sense now that Algorithm S runs at O(nlogn) at the best case, and O(n^2) at the worst case?
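To see why the costs multiply out this way, here is a hypothetical sketch (the real Algorithm S is not specified in the problem; `algorithm_s` below just counts abstract "operations", charging n per even element and log n per odd element, as the problem statement describes):

```python
import math

def algorithm_s(A):
    # Hypothetical sketch: count abstract operations instead of doing real work.
    n = len(A)
    ops = 0
    for x in A:
        if x % 2 == 0:
            ops += n                          # O(n) work for an even element
        else:
            ops += max(1, int(math.log2(n)))  # O(logn) work for an odd element
    return ops

n = 1024
all_even = [2] * n   # worst case: n * O(n)    -> O(n^2)
all_odd  = [1] * n   # best case:  n * O(logn) -> O(nlogn)
print(algorithm_s(all_even))  # 1024 * 1024 = 1048576
print(algorithm_s(all_odd))   # 1024 * 10   = 10240
```

With n = 1024, the all-even array costs 1024 * 1024 operations (the n^2 shape) while the all-odd array costs only 1024 * log2(1024) = 1024 * 10 (the nlogn shape).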


Solution

  • There's a bit of confusion here: the algorithm runtime doesn't depend on n being even or odd, but on whether the numbers in A are even or odd.

    With that in mind, what sort of input A would make Algorithm S run faster? Slower?

    Also: it doesn't make sense to say that the best case of O(n) is O(1). Suppose I have an algorithm ("Algorithm Q") that is O(n); all I can say is that there exists a constant c such that, for any sufficiently large input of size n, Algorithm Q takes at most cn time. There is no guarantee that I can find specific inputs on which Algorithm Q runs in O(1) time.

    To give a concrete example, this takes linear time no matter what input it is passed:

    def length(A):
      # One loop iteration per element, no matter what the elements are.
      count = 0
      for x in A:
        count += 1
      return count
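To make that concrete, we can instrument the same loop and count its iterations directly (a small sketch; `count_iterations` is just the loop above with a tally): the count depends only on the input's size, never on whether its elements are even or odd.

```python
def count_iterations(A):
    # Instrumented version of the length() loop: tally how often the body runs.
    iterations = 0
    for _ in A:
        iterations += 1
    return iterations

# Same size, different contents -- the loop always runs exactly len(A) times.
print(count_iterations([2, 4, 6]))  # 3
print(count_iterations([1, 3, 5]))  # 3
```

So there is no "lucky" input of size n that makes this O(n) algorithm finish in O(1) time, which is exactly why "the best case of O(n) is O(1)" doesn't follow.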