
Asymptotic costs confusion


So I'm trying to get my head around different asymptotic costs. I can generally tell which are faster, but I can't really get to grips with when or why they occur. Can anybody explain when some of the examples below would occur, or even any other complexities?

[image: list of example complexities referenced in the question]


Solution

Knowing how a complexity arises is largely a matter of understanding where the factors come from and how they combine. If one operation is performed once for every step of another operation (nested), the complexities multiply; if the operations are performed one after the other (in sequence), the complexities add.

List traversals, for example, are each O(n); so you know right off the bat that O(n²) complexities arise from doubly nested loops over the input, and O(n³) from triply nested ones, as sketched below.
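As a concrete illustration, here is a minimal Python sketch (the function names are my own, purely for illustration) contrasting nested loops, where the O(n) factors multiply, with sequential passes, where they add:

```python
def count_ordered_pairs(items):
    """O(n^2): the inner loop runs once for every step of the
    outer loop, so the two O(n) factors multiply."""
    pairs = 0
    for a in items:           # O(n) outer traversal
        for b in items:       # O(n) inner traversal per outer step
            if a < b:
                pairs += 1
    return pairs


def total_and_largest(items):
    """O(n) + O(n): two traversals performed in sequence add,
    and the sum collapses to the dominant term, O(n)."""
    total = sum(items)        # first pass, O(n)
    largest = max(items)      # second pass, O(n)
    return total, largest
```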

Now, writing complexities with multiple terms, such as O(n² + 2n + 1), is not common practice in my experience. This is because we typically care about complexity for large n, where the lower-order terms eventually become negligible. Still, you can visualize such an expression as a doubly nested loop (the n² term) whose outer loop also performs two separate O(1) steps per iteration, maybe adding something to a collection (the 2n term), plus a single O(1) operation outside the loops (the + 1 term).
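To make that concrete, the following hypothetical sketch performs exactly n² + 2n + 1 constant-time operations; asymptotically it is still just O(n²):

```python
def quadratic_with_lower_terms(items):
    """Performs n^2 + 2n + 1 constant-time operations; still O(n^2)."""
    n = len(items)
    log = []
    for i in range(n):
        log.append(i)         # first O(1) step per outer pass   -> n
        log.append(-i)        # second O(1) step per outer pass  -> 2n total
        for j in range(n):    # O(1) body, n times per outer pass -> n^2
            _ = i * j
    return len(log)           # one O(1) step outside the loops  -> + 1
```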

Similarly, tree structures often yield logarithmic complexities. The height of a complete, balanced binary tree is proportional to log2(n), where n is the number of nodes in the tree. Thus, finding something in such a tree using a recursive algorithm, which can be thought of as stepping down through the tree's levels, is O(log2 n).
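For instance, a standard recursive search over a binary search tree (a textbook routine, sketched here in Python; not code from the original post) discards half of a balanced tree at every step, so it performs O(log2 n) comparisons:

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right


def bst_contains(node, target):
    """Each call descends one level; on a complete, balanced tree
    of n nodes that is at most ~log2(n) levels."""
    if node is None:
        return False              # fell off the tree: not present
    if target == node.value:
        return True
    if target < node.value:
        return bst_contains(node.left, target)
    return bst_contains(node.right, target)
```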

Note that in the worst case, when the binary tree isn't complete and balanced (i.e. each node has only one child filled in), the tree is topologically identical to a list, and searching through it is therefore O(n). That's why many algorithms list a range of complexities.
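You can reproduce that worst case with the sketch above by inserting already-sorted keys into a BST with no rebalancing: every node gets a single child, and the "tree" degenerates into a list (insert is a hypothetical helper continuing the earlier sketch):

```python
def insert(node, value):
    """Naive BST insertion with no rebalancing."""
    if node is None:
        return Node(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node


root = None
for v in range(1, 8):         # sorted input: 1, 2, ..., 7
    root = insert(root, v)    # each key becomes a right child

# The tree is now a chain 1 -> 2 -> ... -> 7, so
# bst_contains(root, 7) visits all n nodes: O(n), not O(log n).
print(bst_contains(root, 7))  # True, after walking the whole chain
```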