algorithm, time-complexity, computability

Why is it assumed that the time complexity of multiplication by n is constant?


Regardless of how the multiplication (or division) operation is implemented (i.e., as a software function or a hardware instruction), it cannot be done in O(1) time: for big values of n, the processor cannot even compute it in a single instruction.
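For instance, here is a minimal sketch (the function name and limb width are my own, not from the question) of schoolbook multiplication on multi-word integers: once an operand no longer fits in one machine word, a single "multiplication" costs O(w^2) word-level multiply-adds for w-word operands.

#include <stdint.h>
#include <stdio.h>

/* Schoolbook product of two w-limb little-endian big integers.
   O(w*w) single-word multiply-adds: for operands wider than one
   machine word, "one multiplication" is many instructions. */
void bigmul(const uint32_t *a, const uint32_t *b, uint32_t *out, int w)
{
    for (int i = 0; i < 2 * w; i++)
        out[i] = 0;
    for (int i = 0; i < w; i++) {
        uint64_t carry = 0;
        for (int j = 0; j < w; j++) {
            uint64_t t = (uint64_t)a[i] * b[j] + out[i + j] + carry;
            out[i + j] = (uint32_t)t;  /* keep the low 32 bits */
            carry = t >> 32;           /* propagate the high bits */
        }
        out[i + w] += (uint32_t)carry;
    }
}

int main(void)
{
    uint32_t a[2] = { 0xFFFFFFFFu, 0xFFFFFFFFu };  /* 2^64 - 1 */
    uint32_t b[2] = { 0xFFFFFFFFu, 0xFFFFFFFFu };  /* 2^64 - 1 */
    uint32_t out[4];
    bigmul(a, b, out, 2);
    printf("%08x%08x%08x%08x\n", out[3], out[2], out[1], out[0]);
    return 0;
}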

In such algorithms, why are these operations treated as constant rather than dependent on n?

for (i = 1; i <= n; i++) {
    j = n;
    while (j > 1)
        j = j / 3;    // assumed to be a constant-time operation
}
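
To make the counting concrete, here is an instrumented version of the same loop (the ops counter is mine, not part of the original snippet): the inner while body runs about log3(n) times per outer pass, so under the unit-cost assumption the whole nest performs O(n log n) "constant" divisions.

#include <stdio.h>

int main(void)
{
    long n = 1000000, ops = 0;
    for (long i = 1; i <= n; i++) {
        long j = n;
        while (j > 1) {
            j = j / 3;   /* counted as one basic operation */
            ops++;
        }
    }
    printf("n = %ld, divisions = %ld\n", n, ops);  /* ~12 per outer pass */
    return 0;
}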

Solution

  • Time complexity is not a measure of time. It's a count of "basic operations", which can be defined however you want. Often, any arithmetic operation is considered a basic operation. Sometimes (for example, when considering the time complexity of sorting algorithms or hash table operations), the basic operations are comparisons. Sometimes, "basic operations" are operations on single bits, in which case j = j / 3 would have time complexity O(log j); see the sketch after this list.

    The rules that tend to be followed are:

    • if you're talking about sorting or hash tables, basic operations are comparisons.
    • if you're talking about any other practical algorithm, basic operations are arithmetic operations and assignments.
    • if you're talking about P/NP classes, basic operations are the number of steps of a deterministic Turing machine. (I think this is equivalent to bit operations.)
    • if you're talking about practical algorithms as a complexity-theory expert, you'll often assume that basic types have ~log n bits, and that basic operations are arithmetic operations and assignments on these ~log n-bit words (the word RAM model).
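
    To see where the O(log j) in the bit-level model comes from, here is an illustrative sketch (my own, not from the original answer): long division of a decimal digit string by 3 touches each digit exactly once, i.e. Θ(number of digits) = Θ(log j) elementary steps, so the cost of j = j / 3 grows with the length of j's representation.

    #include <stdio.h>
    #include <string.h>

    /* Long division of a decimal digit string by 3, in place.
       One pass over the digits: Theta(digits) = Theta(log j)
       elementary operations, matching the bit-operation cost model. */
    void div3(char *num)
    {
        size_t len = strlen(num);
        int rem = 0;
        for (size_t i = 0; i < len; i++) {
            int cur = rem * 10 + (num[i] - '0');
            num[i] = (char)('0' + cur / 3);  /* one quotient digit */
            rem = cur % 3;
        }
    }

    int main(void)
    {
        char j[] = "123456789123456789";
        div3(j);
        printf("%s\n", j);   /* prints 041152263041152263 */
        return 0;
    }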