Currently I have this sorting algorithm:
public static void sort(int[] A, int i, int j) {
    if (i == j) return;               // single element: nothing to do
    if (j == i + 1) {                 // two elements: swap if out of order
        if (A[i] > A[j]) {
            int temp = A[i];
            A[i] = A[j];
            A[j] = temp;
        }
    } else {
        // Integer division already truncates, so Math.floor was redundant here.
        int k = (j - i + 1) / 3;
        sort(A, i, j - k);            // sort the first 2/3
        sort(A, i + k, j);            // sort the last 2/3
        sort(A, i, j - k);            // sort the first 2/3 again
    }
}
It sorts correctly; however, the asymptotic number of comparisons is quite high: the recurrence is T(n) = 3T(n - floor(n/3)) + Theta(1), i.e. roughly T(n) = 3T(2n/3), which solves to T(n) = Theta(n^(log_{3/2} 3)) ≈ Theta(n^2.71).
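For completeness, here is the recurrence and its solution written out (each of the three recursive calls covers the roughly 2/3 of the range that remains after dropping one third, so the Master theorem applies with a = 3 and b = 3/2):

```latex
% Stooge-sort style recurrence: 3 subproblems of size ~2n/3, constant combine cost.
T(n) = 3\,T\!\left(\left\lfloor \tfrac{2n}{3} \right\rfloor\right) + \Theta(1)
\;\Rightarrow\;
T(n) = \Theta\!\left(n^{\log_{3/2} 3}\right) \approx \Theta\!\left(n^{2.7095}\right)
```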
Therefore, I'm currently thinking of replacing the third recursion, sort(A, i, j - k),
with a newly written iterative method to optimize the algorithm. However, I'm not really sure how to approach the problem and would love to gather some ideas. Thank you!
If I understand this correctly, you first sort the first 2/3 of the list, then the last 2/3, then the first 2/3 again. This actually works, because any misplaced items (items in the first or last 2/3 that actually belong in the last or first 1/3) get shifted into range and are then correctly sorted by the next pass of the algorithm.
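The correctness claim is easy to sanity-check empirically. A minimal harness (restating the question's sort so the snippet is self-contained, and comparing against java.util.Arrays.sort on random inputs):

```java
import java.util.Arrays;
import java.util.Random;

public class StoogeCheck {
    // The question's three-pass sort, behavior unchanged.
    public static void sort(int[] A, int i, int j) {
        if (i == j) return;
        if (j == i + 1) {
            if (A[i] > A[j]) { int t = A[i]; A[i] = A[j]; A[j] = t; }
        } else {
            int k = (j - i + 1) / 3;
            sort(A, i, j - k);   // first 2/3
            sort(A, i + k, j);   // last 2/3
            sort(A, i, j - k);   // first 2/3 again
        }
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int trial = 0; trial < 100; trial++) {
            int n = 1 + rnd.nextInt(50);
            int[] a = rnd.ints(n, 0, 1000).toArray();
            int[] expected = a.clone();
            Arrays.sort(expected);
            sort(a, 0, n - 1);
            if (!Arrays.equals(a, expected)) throw new AssertionError("mismatch at trial " + trial);
        }
        System.out.println("all trials sorted correctly");
    }
}
```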
There are certainly two points that can be optimized.
However, after the second "optimization" you will in fact have more or less re-invented Merge Sort.
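For reference, the Merge Sort structure being alluded to (recursing on two non-overlapping halves and combining them with an iterative merge, instead of three overlapping 2/3 passes) would look roughly like this; it keeps the same sort(A, i, j) signature as the question's code, but it is a sketch of standard merge sort, not the questioner's algorithm:

```java
import java.util.Arrays;

public class MergeSortSketch {
    // Sorts A[i..j] inclusive, mirroring the question's signature.
    public static void sort(int[] A, int i, int j) {
        if (i >= j) return;
        int mid = i + (j - i) / 2;
        sort(A, i, mid);          // sort left half
        sort(A, mid + 1, j);      // sort right half
        merge(A, i, mid, j);      // iterative combine step
    }

    // Iteratively merges the sorted runs A[i..mid] and A[mid+1..j].
    private static void merge(int[] A, int i, int mid, int j) {
        int[] tmp = new int[j - i + 1];
        int l = i, r = mid + 1, t = 0;
        while (l <= mid && r <= j) {
            tmp[t++] = (A[l] <= A[r]) ? A[l++] : A[r++];
        }
        while (l <= mid) tmp[t++] = A[l++];
        while (r <= j)   tmp[t++] = A[r++];
        System.arraycopy(tmp, 0, A, i, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 8, 1, 9, 2};
        sort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // prints [1, 2, 3, 5, 8, 9]
    }
}
```

Note that the recursion now splits into halves with no overlap, so the recurrence drops from 3T(2n/3) + Θ(1) to 2T(n/2) + Θ(n), i.e. Θ(n log n).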