I was reading about the efficiency of radix sort compared to comparison-based sorting algorithms, where I found this passage:
For example, consider a bottom-up merge sort. The first pass will compare pairs of random keys, but the last pass will compare keys that are very close in the sorting order. This makes merge sort, on this class of inputs, take O(n (log n)^2) time.
Can someone help me understand the O(n (log n)^2) part?
Assuming constant compare time, i.e. O(1) per comparison, merge sort is O(n log(n)). Assuming compare time is O(log(n)) per comparison, merge sort is O(n (log(n))^2). The O(log(n)) assumption comes from the minimum key size needed for n distinct keys, which is log2(n) bits: comparing two keys of that length takes time proportional to the number of bits. In practice, however, key size is usually independent of n, either constant, or with a constant average size over the set of n elements, in which case the time complexity is O(n log(n)).
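To see where the n log(n) factor comes from before multiplying by the per-comparison cost, here is a minimal sketch (my own illustration, not from the quoted text): a bottom-up merge sort that counts key comparisons. For random input, the count comes out close to n log2(n); if each of those comparisons cost O(log(n)) bit operations instead of O(1), the total would be O(n (log n)^2).

```python
import math
import random

def bottom_up_merge_sort(a):
    """Bottom-up merge sort; returns (sorted list, comparison count)."""
    a = list(a)
    n = len(a)
    comparisons = 0
    width = 1                       # current run length being merged
    while width < n:
        for lo in range(0, n, 2 * width):
            mid = min(lo + width, n)
            hi = min(lo + 2 * width, n)
            merged = []
            i, j = lo, mid
            while i < mid and j < hi:
                comparisons += 1    # one key comparison per loop step
                if a[i] <= a[j]:
                    merged.append(a[i]); i += 1
                else:
                    merged.append(a[j]); j += 1
            merged.extend(a[i:mid])
            merged.extend(a[j:hi])
            a[lo:hi] = merged
        width *= 2                  # log2(n) doubling passes in total
    return a, comparisons

n = 1 << 12
data = random.sample(range(n), n)   # n distinct keys
s, c = bottom_up_merge_sort(data)
print(c, int(n * math.log2(n)))     # comparison count vs. n log2(n) bound
```

The comparison count never exceeds n log2(n) here; the quoted passage's point is that each comparison is itself O(log n) when keys are only log2(n) bits long, giving the extra log factor.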
The point being made is more about radix sort, where the number of passes depends on how many parts the key is split into. For example, if the key is 64 bits and radix sort is done 4 bits at a time, it takes 16 passes. If the key is 28 characters and radix sort is done 1 character at a time, it takes 28 passes. If we consider the minimum key size for n distinct keys, which is log2(n) bits, and sort 8 bits at a time, then it takes log256(n) passes. For example, sorting 2^32 distinct keys means the keys are at least 32 bits in size, and sorting 8 bits at a time takes 4 passes; for 2^64 keys, it's 64 bits per key and 8 passes.
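The pass-count arithmetic above can be sketched as an LSD radix sort (a minimal example of my own, assuming 64-bit integer keys sorted 8 bits per pass, so 64/8 = 8 passes):

```python
import random

def radix_sort_64(keys, bits_per_pass=8, key_bits=64):
    """LSD radix sort on fixed-width integer keys.

    Number of passes = key_bits / bits_per_pass, independent of len(keys).
    """
    mask = (1 << bits_per_pass) - 1
    passes = key_bits // bits_per_pass          # e.g. 64 / 8 = 8 passes
    for p in range(passes):
        shift = p * bits_per_pass
        # Bucket by the current byte; appending preserves stability,
        # which is what makes LSD radix sort correct.
        buckets = [[] for _ in range(1 << bits_per_pass)]
        for k in keys:
            buckets[(k >> shift) & mask].append(k)
        keys = [k for bucket in buckets for k in bucket]
    return keys, passes

data = [random.getrandbits(64) for _ in range(1000)]
out, passes = radix_sort_64(data)
print(passes)  # 8
```

Changing bits_per_pass trades passes against bucket count: 4 bits per pass means 16 passes over 16 buckets, while 16 bits per pass means 4 passes over 65536 buckets.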