Tags: algorithm, recursion, time-complexity, bubble-sort, master-theorem

# Time complexity of mixedsort (a modification on bubblesort)

So I tried solving this with the master theorem, but I'm unsure if my solution is correct:

```
function MixedSort(A[1..n]) {
    if (n == 1) {
        return A[1];
    }
    m = n/2;
    B[1..m]   = BubbleSort(A[1..m]);   // ordinary bubble sort, O(m^2)
    B[m+1..n] = MixedSort(A[m+1..n]);  // recursive call on the right half
    return Merge(B[1..m], B[m+1..n]);  // merges two sorted arrays in O(n)
}
```
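Since the pseudocode is nearly complete, here is a minimal runnable Python sketch of it (the helper names `bubble_sort` and `merge` are mine, not from the original):

```python
def bubble_sort(a):
    """Ordinary bubble sort on a copy of a; O(m^2) for m = len(a)."""
    a = list(a)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge(left, right):
    """Merge two sorted lists in O(n) time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def mixed_sort(a):
    # Base case (also handles the empty list, which the pseudocode omits).
    if len(a) <= 1:
        return list(a)
    m = len(a) // 2
    left = bubble_sort(a[:m])    # O(m^2) on the left half
    right = mixed_sort(a[m:])    # T(n/2) on the right half
    return merge(left, right)    # O(n)
```

For example, `mixed_sort([5, 3, 1, 4, 2])` returns `[1, 2, 3, 4, 5]`.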
• T(1) = 1
• T(n) = T(n/2) + n^2 + n, which is asymptotically T(n/2) + Θ(n^2)

Then I applied the master theorem and got O(n^2), since a = 1 < b^2 = 2^2 = 4 (the third case).
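Spelled out, the check I'm attempting is case 3 of the master theorem with a = 1, b = 2, f(n) = n^2:

```latex
% Master theorem form: T(n) = a T(n/b) + f(n), here a = 1, b = 2, f(n) = n^2.
\begin{align*}
n^{\log_b a} &= n^{\log_2 1} = n^0 = 1,\\
f(n) = n^2 &= \Omega\!\left(n^{\log_b a + \varepsilon}\right)
  \text{ with } \varepsilon = 2 && \text{(case 3 applies)},\\
a\, f(n/b) = \left(\tfrac{n}{2}\right)^2 &= \tfrac{1}{4}\,n^2 \le c\,n^2
  \text{ with } c = \tfrac{1}{4} < 1 && \text{(regularity condition)},\\
\therefore\; T(n) &= \Theta(f(n)) = \Theta(n^2).
\end{align*}
```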

## Solution

• It is `O(n^2)`, yes. You don't even need the master theorem to see this. The recurrence relation is the following, where the `O((n/2)^2)` term comes from the bubble sort and the `O(n)` term from the merge:

```
T(n) = T(n/2) + O((n/2)^2) + O(n)
```

But `O((n/2)^2)` is the same as `O(n^2)`, and `O(n^2)` dominates `O(n)`, so the `O(n)` term can simply be dropped, yielding

```
T(n) = T(n/2) + O(n^2).
```

The recurrence `T(n) = T(n/2) + n^2` can be solved with the recursion-tree method. At each level of the recursion the problem size is halved, so the non-recursive work shrinks by a factor of four per level: `n^2` at the top, `(n/2)^2 = n^2/4` at the next level, `(n/4)^2 = n^2/16` after that, and so on.

The total time complexity is the sum of the work done at each level, which forms the series:

```
T(n) = n^2 + (n^2 / 4) + (n^2 / 16) + ... + (n^2 / 4^k)
```

Since every term of that series is a constant multiple of n^2, we can factor it out:

```
T(n) = n^2 * (1 + 1/4 + 1/16 + ...)
```

That geometric series converges to `1 / (1 - 1/4) = 4/3`, a constant, so the running time is `O(n^2)`.
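You can also check the convergence numerically; a quick sketch (the helper name `partial_sum` is my own):

```python
def partial_sum(k):
    """Sum of the first k+1 terms of the geometric series 1 + 1/4 + 1/16 + ..."""
    return sum((1 / 4) ** i for i in range(k + 1))

# The partial sums approach 4/3 quickly, which is what makes
# the total work a constant multiple of n^2.
for k in (1, 5, 10, 20):
    print(k, partial_sum(k))
```

Already at `k = 5` the sum is within about 0.0004 of `4/3`.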