Tags: c, arrays, recursion, time-complexity, divide-and-conquer

Second greatest element of an array using divide and conquer


I wrote this function to find the 2nd greatest element of an array, but I have some doubts about its time complexity. Does the if condition contribute only θ(1), or does it increase the time complexity of the recursive calls?

Intuitively, its time complexity should be no greater than that of finding the greatest element with the divide-and-conquer strategy.

int secondmax(int arr[], int first, int last){
   if(first+1==last) return arr[first];       /* single element: it is the max of this range */
   int mid = first + (last-first)/2;
   int left  = secondmax(arr, first, mid);    /* max of the left half  */
   int right = secondmax(arr, mid, last);     /* max of the right half */
   int bigger = left > right ? left : right;
   if(bigger > max1){                         /* new overall maximum */
      max2 = max1;
      max1 = bigger;
   }
   else if(bigger > max2 && bigger != max1){  /* new second maximum */
      max2 = bigger;
   }
   return bigger;                             /* propagate the max of this range upwards */
}

PS: max1 and max2 are global variables; I could probably get rid of max1.
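
A minimal driver the function could be compiled with might look like the sketch below; it assumes max1 and max2 are int globals initialized to INT_MIN and visible to secondmax (their declarations are not shown in the snippet above):

#include <limits.h>
#include <stdio.h>

int max1 = INT_MIN, max2 = INT_MIN;   /* assumed globals: overall max and second max */

int secondmax(int arr[], int first, int last);   /* definition shown above */

int main(void){
    int arr[] = {7, 3, 9, 1, 8, 5};
    int n = sizeof arr / sizeof arr[0];
    secondmax(arr, 0, n);             /* the range is [first, last) */
    printf("max1 = %d, max2 = %d\n", max1, max2);   /* prints 9 and 8 for this input */
    return 0;
}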


Solution

  • The if is not a significant contributor to the time complexity compared with the recursive calls. Given your algorithm, what matters most is the splitting of the array done by the recursion. Doing a dry run on your algorithm shows that it follows the recurrence equation T(n) = 2T(n/2) + θ(1). The θ(1) here stands for the constant unit of time consumed by the instructions you wrote after the two recursive calls, and it stays θ(1) as long as the number of instructions after the recursive calls is constant. If you had instead used a loop whose running time depended on the input array's length, θ(n) would take the place of θ(1).

    So, how do θ(1) and θ(n) affect your algorithm's complexity? You can determine that by applying the good old Master Theorem to the recurrence.

    With θ(1) the equation is T(n) = 2T(n/2) + θ(1), and with θ(n) it becomes T(n) = 2T(n/2) + θ(n). Here a = 2 and b = 2, so n^(log_b a) = n: in the first case the θ(1) combine work is dominated by the recursion itself and the total is θ(n), while in the second case the combine work matches n^(log_b a) at every level and the total is θ(n log n). So that's the main difference you need to remember; the sketch below makes the θ(1) combine step concrete.
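
    As an illustration, here is a minimal sketch of the same divide-and-conquer idea, written so that each call returns both the largest and second-largest value of its range instead of updating globals; the names top_two and struct pair are illustrative, not from the question:

#include <limits.h>
#include <stdio.h>

/* Largest and second-largest values of a range. */
struct pair { int max; int second; };

/* Same divide-and-conquer split as above, but with only a constant amount
   of work after the two recursive calls, so the recurrence is
   T(n) = 2T(n/2) + θ(1), which solves to θ(n).
   The range is [first, last) and is assumed non-empty. */
struct pair top_two(const int arr[], int first, int last){
    if(first + 1 == last){
        struct pair p = { arr[first], INT_MIN };   /* single element */
        return p;
    }
    int mid = first + (last - first) / 2;
    struct pair l = top_two(arr, first, mid);
    struct pair r = top_two(arr, mid, last);

    struct pair out;
    if(l.max > r.max){
        out.max = l.max;
        out.second = (l.second > r.max) ? l.second : r.max;
    } else {
        out.max = r.max;
        out.second = (r.second > l.max) ? r.second : l.max;
    }
    return out;   /* combine step: a fixed number of comparisons, independent of n */
}

int main(void){
    int arr[] = {8, 9, 1, 2, 7};
    struct pair p = top_two(arr, 0, 5);
    printf("max = %d, second max = %d\n", p.max, p.second);   /* prints 9 and 8 */
    return 0;
}

    If that constant combine step were replaced by a loop over the whole range, the recurrence would instead be T(n) = 2T(n/2) + θ(n), giving θ(n log n), just as in merge sort.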