I am trying to measure how long my computer takes to calculate the product of two n-digit integers.
I am using this code to do so:
import timeit

for i in range(50):
    avg = 0
    for j in range(30):
        # time a single multiplication of two equal powers of ten
        avg += timeit.timeit('a*b', 'a=' + str(10**i) + ';b=' + str(10**i))
    print(avg / 30)
Plotting the results gives a graph like this:
Where the X axis is n and the Y axis is the time taken in seconds. As you can see, the time taken jumps in steps, roughly whenever n passes a multiple of 10, rather than increasing steadily.
I do not understand why the time taken varies like this.
A Python int's magnitude is stored as a sequence of 30-bit chunks (or sometimes 15-bit chunks, but that's rarer these days). Adding about 9 decimal digits to a number adds roughly 30 bits, and the time taken to multiply two numbers depends mostly on how many 30-bit chunks each one occupies, so the times step up at increments of about 9 decimal digits, which matches the jumps in your graph.
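If you want to see where those chunk boundaries fall, CPython exposes the internal digit size through sys.int_info. This is a minimal sketch, assuming a CPython build (typically 30-bit digits on 64-bit systems), that counts how many chunks a given power of ten needs:

import sys

# Assumes CPython; sys.int_info reports the size of one internal chunk ("digit").
print(sys.int_info.bits_per_digit)   # usually 30 on 64-bit builds

def chunks(n):
    """Number of internal chunks CPython needs to store the magnitude of n."""
    return -(-n.bit_length() // sys.int_info.bits_per_digit)  # ceiling division

# The chunk count goes up roughly once per 9 extra decimal digits.
for exp in (9, 10, 18, 19, 27, 28):
    n = 10 ** exp
    print(f"10**{exp}: {n.bit_length()} bits -> {chunks(n)} chunks")

Running this shows the chunk count increasing between 10**9 and 10**10, between 10**18 and 10**19, and so on, which lines up with the steps in your timings.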