Tags: algorithm, division, multiplication, addition, subtraction

Does a computer take more time to multiply, divide, add, or subtract two big numbers than two smaller numbers?


We (people) take more time to multiply, add, divide and subtract two large numbers than two small numbers.

Does a computer take more time to multiply 5 * 2 than, say, 51234 * 987654, or is the operation done in the same amount of time?

What about two numbers that are larger than the word size of the processor (e.g. two 128-bit numbers)?

I saw the article https://en.wikipedia.org/wiki/Multiplication_algorithm


Solution

  • It depends on the input type. For primitives natively supported by the CPU, such as multiplication of two 64-bit numbers on a 64-bit CPU: no, these are atomic operations that always take the same amount of time. For non-primitive data types, like Java's BigInteger or comparable library classes in other languages: yes, these are no longer atomic operations, so the time required depends on the size of the operands (see the timing sketch at the end of this answer).

    Multiplication of primitives always takes the same amount of time for a simple reason: the operation is hard-wired without any conditional execution and always iterates over all 64 bits on a 64-bit CPU, no matter whether the input is only 5 bits long or uses all 64 bits. The same applies to architectures with any other word size. A software model of this shift-and-add scheme is sketched below.

    EDIT:
    As @nwellnhof pointed out, some instructions actually do contain branching, for example floating-point arithmetic. These instructions are usually implemented in microcode, though, and so can't be counted as atomic instructions in the narrower sense.
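
    To make the BigInteger point concrete, here is a minimal, unscientific timing sketch in Java. The bit lengths are arbitrary choices, and System.nanoTime() wall-clock timing is only a rough indicator (a real measurement would use a benchmark harness such as JMH):

        import java.math.BigInteger;
        import java.util.Random;

        public class BigMulTiming {
            public static void main(String[] args) {
                Random rnd = new Random(42);
                // One "small" and one "large" arbitrary-precision operand.
                BigInteger small = new BigInteger(64, rnd);        // 64 bits
                BigInteger large = new BigInteger(1 << 20, rnd);   // ~1 million bits

                time("64-bit operands: ", small, small);
                time("1M-bit operands: ", large, large);
            }

            // Rough wall-clock timing of a single multiply.
            static void time(String label, BigInteger a, BigInteger b) {
                long start = System.nanoTime();
                BigInteger product = a.multiply(b);
                long elapsed = System.nanoTime() - start;
                System.out.println(label + elapsed + " ns (result bits: " + product.bitLength() + ")");
            }
        }

    The large multiplication should take orders of magnitude longer, because BigInteger multiplication is a software loop over many machine words rather than a single hard-wired instruction.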
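
    And to illustrate why the primitive case is constant-time, here is a software sketch of the shift-and-add scheme that a fixed-width hardware multiplier implements. The loop body runs exactly 64 times whether the operands are tiny or fill the whole word; this is an illustrative model, not how a real CPU (or the JVM) actually multiplies longs:

        public class ShiftAddMultiply {
            // Multiplies two 64-bit values modulo 2^64, branch-free:
            // the same fixed amount of work for every input.
            static long multiply(long a, long b) {
                long product = 0;
                for (int bit = 0; bit < 64; bit++) {
                    // mask is all ones if the current bit of `a` is set, else all zeros,
                    // mimicking the AND-gated adder rows of a hardware multiplier.
                    long mask = -((a >>> bit) & 1);
                    product += (b << bit) & mask;
                }
                return product;
            }

            public static void main(String[] args) {
                System.out.println(multiply(5, 2));                              // 10
                System.out.println(multiply(51234, 987654) == 51234L * 987654L); // true
            }
        }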