There is a theorem in Cormen (CLRS, Theorem 8.1) which says: "For comparison-based sorting, you cannot have an algorithm that sorts a given list using fewer than n log n comparisons (time) in the worst case." In other words, the worst-case time complexity of any comparison-based sorting technique is Ω(n log n).
Now what I was searching for is whether there exists a corresponding statement for the best case, or even for the average case, which says something like:
"You cannot have a sorting algorithm which takes less than some time X to sort a given list of elements in the best case."
Basically, do we have any lower bound for the best case, or, for that matter, for the average case? (I tried my best to find this, but couldn't find it anywhere.) Please also tell me whether the point I am raising is even worth asking.
Great question! The challenge with defining “average case” complexity is that you have to ask “averaged over what?”
For example, if we assume that the array is equally likely to be in any one of the n! possible permutations of its n elements, then the Ω(n log n) bound on comparison sorting still holds in the average case: any comparison sort corresponds to a binary decision tree with at least n! leaves, and the average leaf depth of such a tree is at least lg(n!) = Θ(n log n), not just its maximum depth. I seem to remember that making this argument fully rigorous is a bit more involved than the worst-case version.
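If you want an empirical sanity check of that average-case claim, here is a minimal sketch. The `Counted` wrapper is just a hypothetical helper I'm introducing to count `<` comparisons; it averages the comparisons Python's built-in `sorted()` makes over all n! permutations and compares the result with lg(n!):

```python
import math
from itertools import permutations

class Counted:
    """Hypothetical helper: wraps a value and counts every < comparison made on it."""
    count = 0  # shared counter, reset before each sort

    def __init__(self, value):
        self.value = value

    def __lt__(self, other):
        Counted.count += 1
        return self.value < other.value

def average_comparisons(n):
    """Average number of comparisons sorted() uses over all n! permutations of range(n)."""
    total = 0
    for perm in permutations(range(n)):
        Counted.count = 0
        sorted(Counted(x) for x in perm)
        total += Counted.count
    return total / math.factorial(n)

for n in range(2, 8):
    bound = math.log2(math.factorial(n))  # lg(n!) = Theta(n log n)
    print(f"n={n}: average comparisons = {average_comparisons(n):.3f}, lg(n!) = {bound:.3f}")
```

The average never drops below lg(n!), and the same would be true for any other comparison sort you substitute in, which is exactly what the average-case version of the bound predicts.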
On the other hand, suppose we assume that there are trends in the data: say, you're measuring temperatures over the course of a day, where you know they generally trend upward and then downward. Many real-world data sets look like this, and there are adaptive algorithms like Timsort that can take advantage of those patterns to run faster. So perhaps "average" here would mean "averaged over all possible plots formed by a rising and then falling sequence with noise terms added in." I haven't personally encountered work analyzing algorithms under those distributions, but I'm sure some has been done, and there may even be some nice average-case measures for them that are less well known.
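As a rough illustration of that second point, here is a sketch using the same hypothetical comparison-counting wrapper as above and an idealized rise-then-fall sequence (no noise, rather than real temperature data). It compares how many comparisons `sorted()` (Timsort) makes on the trended data versus a random shuffle of the same values:

```python
import math
import random

class Counted:
    """Hypothetical helper: wraps a value and counts every < comparison made on it."""
    count = 0

    def __init__(self, value):
        self.value = value

    def __lt__(self, other):
        Counted.count += 1
        return self.value < other.value

def comparisons(data):
    """Number of comparisons Python's sorted() (Timsort) makes on `data`."""
    Counted.count = 0
    sorted(Counted(x) for x in data)
    return Counted.count

n = 100_000
values = [min(i, n - i) for i in range(n)]  # rises, then falls: two long natural runs
shuffled = random.sample(values, n)         # same values in random order

print("rise-then-fall:", comparisons(values))    # far fewer: Timsort finds the two long runs and merges once
print("shuffled:      ", comparisons(shuffled))  # close to n*lg(n), the general comparison bound
print("n*lg(n)       =", round(n * math.log2(n)))
```

The gap reflects Timsort's adaptivity: its cost is driven largely by how many pre-existing runs the input contains, so an "average" taken over trended inputs can come out far below the permutation-uniform Ω(n log n).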