
Python sum of digits error


While solving Project Euler problems with Python (which I am a beginner at), I got the following unexpected output. The task is to find the sum of the digits of 2^1000. For that I wrote the following code:

sum=0
x=2**1000
while(x):
  sum += x%10
  print(sum) #Just to check whats happening
  x /= 10

print("\n"*5)
print("Sum = ",sum)

When I run it, the running sum turns into a decimal (float) partway through.

Output :

6
10.0
10.0 
12.0
16.0

....

1116.0
1122.0
1131.625  #Why does the decimal get added?
1138.59375

.....

1181.495136589947
1186.5812084526442
1188.089815638914
1195.240676357541
1195.9557624294036
1197.0272710365898
1197.1344218973084
1197.1451369833803
1197.1462084919874

.....
1197.1463275484991 #This number gets repeated a lot of times
1197.1463275484991
1197.1463275484991



Sum = 1197.1463275484991

Please explain what's going on and help.


Solution

  • In Python 3, `/` is true division and always returns a float, so `x /= 10` converts `x` to a float on the very first iteration. A float cannot represent a 302-digit number like 2^1000 exactly, so `x % 10` starts producing fractional "digits" once the precision runs out, and eventually `x /= 10` stops changing `x` at all, which is why the same value repeats.

  • Use floor division instead, which keeps `x` an arbitrary-precision integer throughout:

    x //= 10

  • As a side note, naming the variable `sum` shadows the built-in `sum()` function; a name like `total` is safer.
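
With that one change the loop stays in exact integer arithmetic. A minimal corrected sketch (the one-liner at the end is an equivalent string-based alternative, not part of the original code):

```python
# Sum the digits of 2**1000 using exact integer arithmetic.
total = 0            # 'total' avoids shadowing the built-in sum()
x = 2 ** 1000
while x:
    total += x % 10  # extract the last digit
    x //= 10         # floor division keeps x an int
print("Sum =", total)

# Equivalent one-liner: treat the number as a string of digit characters.
print(sum(int(d) for d in str(2 ** 1000)))
```

Both print the same integer result, since Python's `int` has unlimited precision and no rounding ever occurs.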