
Inaccurate digits due to binary representation in BigDecimal. How do I get around it?


I wanted to write a parser that converts a String to a BigDecimal. It is required to be 100% accurate. (Well, I am currently programming for fun, so I might as well ask for it... ;-P)

So I came up with this program:

public static BigDecimal parse(String term) {
    char[] termArray = term.toCharArray();

    BigDecimal val = new BigDecimal(0D);
    int decimal = 0; // 0 while before the decimal point; counts fraction positions after it
    for (char c : termArray) {
        if (Character.isDigit(c)) {
            if (decimal == 0) {
                // Integer part: shift left one digit and add.
                val = val.multiply(new BigDecimal(10D));
                val = val.add(new BigDecimal(Character.getNumericValue(c)));
            } else {
                // Fraction part: scale the digit with a double power of ten.
                val = val.add(new BigDecimal(Character.getNumericValue(c) * Math.pow(10, -1D * decimal)));
                decimal++;
            }
        }
        if (c == '.') {
            if (decimal != 0) {
                throw new IllegalArgumentException("There mustn't be multiple points in this number: " + term);
            } else {
                decimal++;
            }
        }
    }

    return val;
}

So I tried:

parse("12.45").toString();

I expected the result to be 12.45. Instead, it was 12.45000000000000002498001805406602215953171253204345703125. I know this is probably due to the limitations of binary floating-point representation. But how can I get around it?
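For illustration, here is a minimal standalone snippet (the class name DoubleDemo is just for this sketch) that makes the problem visible: the BigDecimal(double) constructor preserves the exact binary value of its argument, so it shows what the double closest to 0.01 really is.

import java.math.BigDecimal;

public class DoubleDemo {
    public static void main(String[] args) {
        // new BigDecimal(double) keeps the exact binary value of the double,
        // so this prints the true value of the double literal 0.01.
        System.out.println(new BigDecimal(0.01));
        // prints 0.01000000000000000020816681711721685... - slightly above 0.01
    }
}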

Note: I know that I could just use new BigDecimal("12.45"). But that's not my point - I want to write it on my own, regardless of how stupid this might be.


Solution

  • Yes, it's due to the limitations of binary representation: no negative power of 10 can be represented exactly as a double, so Math.pow(10, -decimal) already introduces an error before BigDecimal ever sees the value, and the BigDecimal(double) constructor preserves that error exactly.

    To get around this, replace the double arithmetic with BigDecimal arithmetic throughout (a full sketch of the fixed method follows after this answer):

    val = val.add(
        new BigDecimal(Character.getNumericValue(c)).divide(BigDecimal.TEN.pow(decimal)));
    

    With this I get 12.45.
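
For completeness, here is a minimal sketch of the whole method with that fix applied - the same parsing logic as in the question, with the double-based constants also swapped for the exact BigDecimal constants (the class name ExactParser is just for this sketch):

import java.math.BigDecimal;

public class ExactParser {

    public static BigDecimal parse(String term) {
        BigDecimal val = BigDecimal.ZERO;
        int decimal = 0; // 0 before the point; counts fraction positions after it
        for (char c : term.toCharArray()) {
            if (Character.isDigit(c)) {
                if (decimal == 0) {
                    // Integer part: shift left one decimal digit and add.
                    val = val.multiply(BigDecimal.TEN);
                    val = val.add(new BigDecimal(Character.getNumericValue(c)));
                } else {
                    // Fraction part: divide by 10^decimal. This is exact, because
                    // dividing by a power of ten always terminates, so divide()
                    // never throws ArithmeticException here.
                    val = val.add(new BigDecimal(Character.getNumericValue(c))
                            .divide(BigDecimal.TEN.pow(decimal)));
                    decimal++;
                }
            }
            if (c == '.') {
                if (decimal != 0) {
                    throw new IllegalArgumentException(
                            "There mustn't be multiple points in this number: " + term);
                } else {
                    decimal++;
                }
            }
        }
        return val;
    }

    public static void main(String[] args) {
        System.out.println(parse("12.45")); // prints 12.45
    }
}

As a design note, an alternative is to accumulate every digit into one unscaled value and call movePointLeft(fractionDigits) once at the end; BigDecimal stores an unscaled integer plus a decimal scale, so that shift is exact as well and avoids division entirely.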