Tags: arrays, algorithm, sorting, divide-and-conquer

Time and space complexity of divide and conquer k-way-merge algorithm


A is an array of k arrays; each inner array is sorted and contains m elements.

Given this algorithm for merging the k sorted arrays held in A:

// A is an array of k sorted arrays, each of length m
K-arrays-merge(A)
    if A.length == 1
        return first element of A
    half = A.length / 2
    firstHalfArray  = new array[half][m]
    secondHalfArray = new array[A.length - half][m]
    for (i = 0; i < half; i++)
        firstHalfArray[i] = A[i]
    for (i = half; i < A.length; i++)
        secondHalfArray[i - half] = A[i]   // note the index shift into the second half
    a = K-arrays-merge(firstHalfArray)
    b = K-arrays-merge(secondHalfArray)
    return merge(a, b) // standard merge of two sorted arrays, O(n) in their total length
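A runnable version of the pseudocode, as a sketch in Python (`merge` is the usual two-pointer merge of two sorted lists; the slices play the role of the array copies in the pseudocode):

```python
def merge(a, b):
    """Two-pointer merge of two sorted lists: O(len(a) + len(b))."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])   # at most one of these two tails is non-empty
    out.extend(b[j:])
    return out

def k_arrays_merge(A):
    """Divide-and-conquer merge of A, a list of k sorted lists."""
    if len(A) == 1:
        return A[0]
    half = len(A) // 2
    first_half = A[:half]    # the copy step from the pseudocode
    second_half = A[half:]
    a = k_arrays_merge(first_half)
    b = k_arrays_merge(second_half)
    return merge(a, b)

print(k_arrays_merge([[1, 4], [2, 5], [0, 3]]))  # [0, 1, 2, 3, 4, 5]
```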

Why is the time complexity of this algorithm O(m·k·log k)?

My first thought was that the recursion runs log k times; in each round, copying the arrays is O(m·k) and the merge is O(m·i·log(m·i)), where 1 ≤ i ≤ log k.

When looking at the divide and conquer steps:

  1. Divide - calculating the middle point is O(1); copying the arrays is O(m·k).
  2. Conquer - 2T(k/2), the two recursive calls.
  3. Combine - the merge, with complexity O(m·i·log(m·i)), where 1 ≤ i ≤ log k.

I do not know how to derive the time complexity from this. Also, what is the space complexity of this algorithm?
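One way to build intuition before a formal derivation is to count operations empirically. The sketch below uses a hypothetical helper `count_ops` that mirrors the pseudocode's recursion, assuming the copy step and the final merge each touch all k·m elements; the totals are compared against m·k·log₂ k:

```python
import math

def count_ops(k, m):
    """Count element copies + merge steps for k sorted arrays of m elements each."""
    if k == 1:
        return 0
    half = k // 2
    # copying both halves touches k*m elements; merging a and b touches k*m again
    return count_ops(half, m) + count_ops(k - half, m) + 2 * k * m

m = 5
for k in (2, 8, 64, 1024):
    print(k, count_ops(k, m), m * k * math.log2(k))
```

For k a power of two the ratio between the two columns is a constant (here exactly 2), which is precisely what O(m·k·log k) predicts.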


Solution

  • If we suppose n = mk, the recurrence is T(k) = 2T(k/2) + n + n: n for the array copy, n for the merge, and 2T(k/2) for the recursive calls. Hence the time complexity of the algorithm is O(n·log k) = O(mk·log k).

    Since all values of the array are copied at every level of the recursion, and the depth of the recursion is O(log k), the space complexity is O(n·log k) = O(mk·log k).
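Unrolling the recurrence makes the bound explicit (a sketch assuming k is a power of two; each level of the recursion tree does 2mk total work, and there are log₂ k levels):

```latex
\begin{aligned}
T(k) &= 2\,T(k/2) + 2mk \\
     &= 4\,T(k/4) + 2mk + 2mk \\
     &\;\;\vdots \\
     &= k\,T(1) + 2mk\log_2 k \\
     &= O(mk\log k)
\end{aligned}
```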