I am getting strange output when I add doubles together. Can someone tell me why I'm getting repeating decimals when I add 0.1 each time? I have worked out the formula for summing these numbers, and I have done the addition myself on paper up to 3.3:
The sum of all the numbers (decreasing by one tenth) from 3.3 down to 1.0 equals 51.6:

  3.3
  3.2
  3.1
  3.0
  ...
+ 1.0
-----
 51.6
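Just to double-check my arithmetic, here is a quick sketch that does the same sum in whole tenths (plain ints, so no fractions are involved at all); it also gives 51.6:

public static void main(String[] args) {
    // add 1.0 through 3.3 as whole tenths: 10 + 11 + ... + 33
    int tenths = 0;
    for (int t = 10; t <= 33; t++) {
        tenths += t;
    }
    System.out.println(tenths / 10.0); // 516 tenths -> prints 51.6
}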
There is an easier way to calculate this using two formulas:

The linear formula for the increasing number: Y = 0.1X + 1
And the arithmetic-series sum formula: total = [X * (Y + 1)] / 2

First solve for Y using the number of days (in this case 100):

11 = 0.1(100) + 1

Then solve for the total using X and Y:

[100 * (11 + 1)] / 2 = 600
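For reference, the same closed-form calculation done directly in code (a quick sketch of nothing but the two formulas above) also prints 600:

public static void main(String[] args) {
    int x = 100;                    // number of days
    double y = 0.1 * x + 1;         // linear formula: Y = 0.1X + 1 -> 11.0
    double total = x * (y + 1) / 2; // series sum: [X * (Y + 1)] / 2
    System.out.println(total);      // prints 600.0
}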
So I believe the output of the following code should be 600, and it certainly should not contain a repeating decimal. What am I doing wrong here? There must be something I missed.
public static void main(String[] args) {
    int days = 100;   // number of terms to add
    double inc = 0.1; // daily increment
    double init = 1;  // current day's value, starts at 1.0
    double total = 0; // running sum

    for (int i = 1; i <= days; i++) {
        if (i == 1) {
            // day 1: add the initial value without incrementing
            total = total + init;
        } else {
            // every later day: grow by 0.1, then add
            init = init + inc;
            total = total + init;
        }
    }

    System.out.println("Total: " + total);
    System.out.println("Daily: " + init);
}
Please read the link that Don Roby posted. In essence, double precision is not a good way to represent decimal fractions. A number like 0.1 has no exact representation in binary floating point, because a floating-point number is stored as "some integer times two to the power of something else", and no such combination comes out to exactly 0.1. Thus you are really working with a slightly different number: the double closest to 0.1 is in fact a little larger, roughly 0.10000000000000000555. That tiny error, accumulated over a hundred additions, is enough to mess up the math.
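Here is a quick sketch that demonstrates both points: new BigDecimal(double) reveals the exact binary value that the literal 0.1 actually stores, and redoing your loop with exact decimal arithmetic (BigDecimal values built from strings) removes the noise entirely:

import java.math.BigDecimal;

public class ExactSum {
    public static void main(String[] args) {
        // the exact value the double literal 0.1 really stores:
        System.out.println(new BigDecimal(0.1));
        // prints 0.1000000000000000055511151231257827021181583404541015625

        // the same loop as in the question, but with exact decimal arithmetic
        BigDecimal inc = new BigDecimal("0.1");
        BigDecimal init = new BigDecimal("1");
        BigDecimal total = BigDecimal.ZERO;
        for (int i = 1; i <= 100; i++) {
            if (i > 1) {
                init = init.add(inc); // days 2..100: grow by exactly 0.1
            }
            total = total.add(init);
        }
        System.out.println("Total: " + total); // Total: 595.0
        System.out.println("Daily: " + init);  // Daily: 10.9
    }
}

(Note in passing that even exact arithmetic gives 595.0 rather than 600, because the increment is applied only on days 2 through 100 - but that is a separate issue from the floating-point noise you are asking about.)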
See Jon Skeet's excellent answer on this topic: https://stackoverflow.com/a/1089026/1967396