Tags: algorithm, time-complexity, implementation

Are algorithms with high time complexity ever used in the real world for small inputs?


Let's say we have a problem where one algorithm, call it algorithm_1, solves it in O(n^2) time and another algorithm, call it algorithm_2, solves it in O(n) time, but in practice we observe that algorithm_1 is faster for n < 1000 and algorithm_2 is faster otherwise.

Why can't we just write code like this:

if (n < 1000)
  do algorithm_1
else
  do algorithm_2

Is this a real thing programmers do, or are there downsides to it?

In a small program this seems like a good idea.
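
Concretely, I mean something like this sketch (algorithm_1, algorithm_2, and the cutoff of 1000 are just placeholders for this question; in practice the threshold would come from benchmarking):

#include <cstddef>
#include <vector>

// Hypothetical solvers: algorithm_1 is O(n^2) with a small constant
// factor, algorithm_2 is O(n) with a larger constant factor.
void algorithm_1(std::vector<int>& data) { /* ... quadratic approach ... */ }
void algorithm_2(std::vector<int>& data) { /* ... linear approach ... */ }

// Pick whichever algorithm is faster for the actual input size.
void solve(std::vector<int>& data) {
  if (data.size() < 1000)
    algorithm_1(data);
  else
    algorithm_2(data);
}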


Solution

  • This does happen in the real world! For example, a famous sorting algorithm is Timsort:

    Timsort

    Details of a typical simplified implementation:

    The run size is taken as 32, and the input array is divided into sub-arrays of that size.

    Each piece of run size is sorted one by one with a simple insertion sort. After the individual pieces are sorted, they are merged using the merge step of merge sort.

    The size of the merged subarrays doubles after every pass.

    Insertion sort has complexity O(N^2) but is faster for tiny lists; merge sort has complexity O(N log N), so it is better for longer lists.
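
    Here is a rough C++ sketch of the simplified scheme described above (an illustration of the run-plus-merge idea only, not the real Timsort, which also detects natural runs and uses galloping merges):

      #include <algorithm>
      #include <vector>

      const int RUN = 32;  // run size used in this simplified version

      // Sort a[left..right] (inclusive) with insertion sort: O(n^2) in
      // general, but very fast on pieces of at most RUN elements.
      void insertionSort(std::vector<int>& a, int left, int right) {
          for (int i = left + 1; i <= right; ++i) {
              int key = a[i];
              int j = i - 1;
              while (j >= left && a[j] > key) {
                  a[j + 1] = a[j];
                  --j;
              }
              a[j + 1] = key;
          }
      }

      // Merge the two sorted ranges a[l..m] and a[m+1..r].
      void merge(std::vector<int>& a, int l, int m, int r) {
          std::vector<int> left(a.begin() + l, a.begin() + m + 1);
          std::vector<int> right(a.begin() + m + 1, a.begin() + r + 1);
          std::size_t i = 0, j = 0;
          int k = l;
          while (i < left.size() && j < right.size())
              a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
          while (i < left.size())  a[k++] = left[i++];
          while (j < right.size()) a[k++] = right[j++];
      }

      // Simplified Timsort: insertion-sort runs of size RUN, then merge
      // the runs, doubling the merged size on every pass.
      void timSort(std::vector<int>& a) {
          int n = static_cast<int>(a.size());
          for (int i = 0; i < n; i += RUN)
              insertionSort(a, i, std::min(i + RUN - 1, n - 1));
          for (int size = RUN; size < n; size *= 2) {
              for (int left = 0; left < n; left += 2 * size) {
                  int mid = left + size - 1;
                  int right = std::min(left + 2 * size - 1, n - 1);
                  if (mid < right)
                      merge(a, left, mid, right);
              }
          }
      }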

    Introsort

    Another example is introsort, which is commonly used to implement std::sort in C++ standard libraries:

    Introsort or introspective sort is a hybrid sorting algorithm that provides both fast average performance and (asymptotically) optimal worst-case performance. It begins with quicksort, switches to heapsort when the recursion depth exceeds a level based on (the logarithm of) the number of elements being sorted, and switches to insertion sort when the number of elements drops below some threshold. This combines the good parts of the three algorithms, giving practical performance comparable to quicksort on typical data sets and worst-case O(n log n) runtime thanks to heapsort. Since all three of its component algorithms are comparison sorts, it is also a comparison sort.
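
    A condensed C++ sketch of that hybrid (for illustration only; real library implementations typically use a smarter pivot choice such as median-of-three and organize the final insertion-sort pass differently):

      #include <algorithm>
      #include <cmath>
      #include <vector>

      const int SMALL = 16;  // below this size, fall back to insertion sort

      // A simple Lomuto partition keeps the sketch short; production code
      // would normally pick the pivot more carefully.
      int partition(std::vector<int>& a, int lo, int hi) {
          int pivot = a[hi];
          int i = lo;
          for (int j = lo; j < hi; ++j)
              if (a[j] < pivot)
                  std::swap(a[i++], a[j]);
          std::swap(a[i], a[hi]);
          return i;
      }

      void insertionSort(std::vector<int>& a, int lo, int hi) {
          for (int i = lo + 1; i <= hi; ++i) {
              int key = a[i], j = i - 1;
              while (j >= lo && a[j] > key) { a[j + 1] = a[j]; --j; }
              a[j + 1] = key;
          }
      }

      // Quicksort until the recursion gets too deep, then heapsort;
      // tiny ranges are finished off with insertion sort.
      void introsortImpl(std::vector<int>& a, int lo, int hi, int depth) {
          if (hi - lo + 1 < SMALL) {
              insertionSort(a, lo, hi);
              return;
          }
          if (depth == 0) {
              std::make_heap(a.begin() + lo, a.begin() + hi + 1);
              std::sort_heap(a.begin() + lo, a.begin() + hi + 1);
              return;
          }
          int p = partition(a, lo, hi);
          introsortImpl(a, lo, p - 1, depth - 1);
          introsortImpl(a, p + 1, hi, depth - 1);
      }

      void introsort(std::vector<int>& a) {
          if (a.empty()) return;
          int depthLimit = 2 * static_cast<int>(std::log2(a.size()));
          introsortImpl(a, 0, static_cast<int>(a.size()) - 1, depthLimit);
      }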

    The downside: added complexity

    The downside of using multiple algorithms for a single task is, of course, increased complexity. It is worth it if you are writing standard-library code for a programming language that will be reused millions or even billions of times. For smaller projects, saving developer time rather than machine time by implementing only one algorithm is often the better choice.
