I have a method that takes two double values as input. When I try to add them together I get incorrect values above a certain threshold, so I started using BigDecimal.
However, even with BigDecimal I still get incorrect values.
double value1 = 2789.45;
double value2 = 557.89;
System.out.println(BigDecimal.valueOf(value1 + value2));
prints
3347.3399999999997
when it should read as
3347.34
How can I do this correctly even if value1 and value2 could be larger than the values shown here? (They are calculated in a separate method.)
Should I just use rounding?
Should I just use rounding?
Nope. What you are experiencing is loss of precision in the double sum.
You are summing the doubles first (value1 + value2) and only then converting the double sum, which has already lost precision, to a BigDecimal.
To avoid this, convert each value to BigDecimal first and do the addition there:
double value1 = 2789.45;
double value2 = 557.89;
System.out.println(BigDecimal.valueOf(value1).add(BigDecimal.valueOf(value2)));
OUTPUT:
3347.34
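Since the OP says the two doubles come from a separate method, here is a minimal, self-contained sketch of the same idea; the class and method names are just placeholders, not from the OP's code:
double value1 = 2789.45;
import java.math.BigDecimal;

public class SumDemo {

    // Convert each double separately, then add as BigDecimal,
    // so the lossy double addition never happens.
    static BigDecimal sum(double a, double b) {
        return BigDecimal.valueOf(a).add(BigDecimal.valueOf(b));
    }

    public static void main(String[] args) {
        System.out.println(sum(2789.45, 557.89)); // prints 3347.34
    }
}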
Working IDEONE DEMO here
UPDATE
-1. Instead of using BigDecimal.valueOf, which is still fragile, you should be using BigDecimal from the beginning and create them with new BigDecimal("2789.45"), new BigDecimal("557.89"). As soon as you've used a double literal, you've introduced imprecision. BigDecimal.valueOf tries to get it back, but it doesn't always work. – Louis Wasserman
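For what it's worth, here is a small sketch (not from the original discussion, values chosen only for illustration) of the case Louis Wasserman means: once a literal has more decimal digits than a double can represent, BigDecimal.valueOf cannot recover them, while new BigDecimal(String) keeps every digit.
import java.math.BigDecimal;

public class ValueOfLimits {
    public static void main(String[] args) {
        // 19 significant digits -- more than a double can hold.
        double d = 0.1234567890123456789;

        // valueOf goes through Double.toString, so the trailing digits
        // are already gone (prints something like 0.12345678901234568):
        System.out.println(BigDecimal.valueOf(d));

        // Building from the String keeps the full value:
        System.out.println(new BigDecimal("0.1234567890123456789"));
    }
}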
Actually, I don't totally agree with this. You cannot always hardcode the values when creating new BigDecimals; what I can agree with is reading the values (if possible for the OP) as a String directly, to avoid the double precision loss:
String value1 = "2789.45";
BigDecimal one = new BigDecimal(value1);
String value2 = "557.89";
BigDecimal two = new BigDecimal(value2);
System.out.println(one.add(two));
OUTPUT:
3347.34
This will avoid the problems Louis Wasserman points out.
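If the values arrive as text anyway (user input, a file, etc.), a minimal sketch of that idea; Scanner here is only an example input source, not something taken from the OP's code:
import java.math.BigDecimal;
import java.util.Scanner;

public class ReadAsString {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        // Read the raw text and never go through double at all.
        BigDecimal one = new BigDecimal(in.next()); // e.g. 2789.45
        BigDecimal two = new BigDecimal(in.next()); // e.g. 557.89
        System.out.println(one.add(two));           // 3347.34
    }
}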
NEW WORKING DEMO