algorithm · time-complexity · quickselect

Quickselect time complexity explained


The Wikipedia article on Quickselect (https://en.wikipedia.org/wiki/Quickselect) says:

"However, instead of recursing into both sides, as in quicksort, quickselect only recurses into one side – the side with the element it is searching for. This reduces the average complexity from O(n log n) to O(n), with a worst case of O(n^2)."

I don't understand how recursing into only one side reduces the average complexity to O(n). Wouldn't it be more like O(n/2 log n), which is still O(n log n)? And how is the worst case O(n^2)?


Solution

  • O(n log n) implies that the algorithm looks at all n items log(n) times. But that's not what happens with Quickselect.

    Let's say you're using Quickselect to pick the top 8 items in a list of 128. And by some miracle of random selection, the pivots you pick are always at the halfway point.

    On the first iteration, the algorithm examines all 128 items and partitions them into two groups of 64 items each. It then recurses into only one of those groups, splitting its 64 items into two groups of 32. Then 16, and then 8. The number of items examined is:

    N + N/2 + N/4 + N/8 + N/16
    

    That is a geometric series, and its sum never reaches 2*N. Since the total work is bounded by a constant multiple of N, the average complexity is O(n).
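    That shrinking-subproblem behavior can be sketched in Python. This is a minimal sketch, not the canonical in-place implementation, and the `comparisons` counter is added here only to illustrate that the total work grows linearly:

    ```python
    import random

    comparisons = 0  # counts items examined across all partitioning passes

    def quickselect(items, k):
        """Return the k-th smallest element (0-indexed) of items."""
        global comparisons
        comparisons += len(items)           # one partitioning pass scans every item
        pivot = random.choice(items)
        smaller = [x for x in items if x < pivot]
        equal   = [x for x in items if x == pivot]
        larger  = [x for x in items if x > pivot]
        if k < len(smaller):
            return quickselect(smaller, k)  # recurse into the left side only
        elif k < len(smaller) + len(equal):
            return pivot                    # the pivot itself is the answer
        else:
            # recurse into the right side only, adjusting k
            return quickselect(larger, k - len(smaller) - len(equal))

    data = list(range(128))
    random.shuffle(data)
    result = quickselect(data, 7)   # → 7, the 8th-smallest value
    # comparisons is typically a small multiple of 128,
    # not 128 * log2(128) = 896 as a full sort would need
    ```

    Because only one side is recursed into, each level of recursion examines a (roughly) geometrically shrinking number of items, which is exactly the series above.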

    The worst case occurs when partitioning always produces very skewed partition sizes. Consider what would happen if each partitioning step removed only a single item: the first pass examines N items, the second N-1, and so on. The total number of items examined would be:

    N + (N-1) + (N-2) ...

    which sums to (n^2 + n)/2, or O(n^2).
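    The worst case can be reproduced deterministically: pivot on the first element (a hypothetical variant chosen here just for illustration; real implementations usually randomize the pivot to make this unlikely) and ask for the largest element of already-sorted input, so each pass peels off only the pivot:

    ```python
    def quickselect_first_pivot(items, k):
        """k-th smallest (0-indexed), always pivoting on the first element.

        Returns (value, examined) where examined counts items scanned
        across all partitioning passes.
        """
        examined = 0
        while True:
            pivot = items[0]
            examined += len(items)      # partitioning scans every remaining item
            smaller = [x for x in items[1:] if x < pivot]
            larger  = [x for x in items[1:] if x >= pivot]
            if k < len(smaller):
                items = smaller
            elif k == len(smaller):
                return pivot, examined
            else:
                items = larger
                k -= len(smaller) + 1

    n = 128
    # Sorted input + first-element pivot: each pass removes exactly one item.
    value, examined = quickselect_first_pivot(list(range(n)), n - 1)
    # examined == n + (n-1) + ... + 1 == n*(n+1)//2 == 8256, i.e. O(n^2)
    ```

    Compare this with the ~2N items examined in the balanced case above: the same algorithm, with consistently bad pivots, degrades from linear to quadratic.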