Given the simple program
import java.math.*;
import static java.math.BigDecimal.ONE;
import static java.lang.System.out;

public class Test {
    public static void main(String[] args) {
        StringBuffer ruler = new StringBuffer(" ");
        for (int i = 0; i < 5; i++) {
            ruler.append("1234567890");
        }
        out.println(ONE.divide(new BigDecimal(47), 50, RoundingMode.HALF_UP));
        out.println(ONE.divide(new BigDecimal(47), new MathContext(50, RoundingMode.HALF_UP)));
        out.println(ruler);
        out.println(ONE.divide(new BigDecimal(6), 5, RoundingMode.HALF_UP));
        out.println(ONE.divide(new BigDecimal(6), new MathContext(5, RoundingMode.HALF_UP)));
    }
}
This is the output:

0.02127659574468085106382978723404255319148936170213
0.021276595744680851063829787234042553191489361702128
 12345678901234567890123456789012345678901234567890
0.16667
0.16667
I would expect the second line of output to be the same as the first line. Is that a bug, or have I misinterpreted the BigDecimal documentation?
JVM version:

$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
You have confused scale (the total number of decimal places) with precision (the number of significant digits). For numbers between -1 and 1, the precision does not count the zeroes between the decimal point and the first non-zero digit, but the scale does.
The second argument to BigDecimal.divide is a scale, so your first output gets exactly 50 decimal places.

The argument to the MathContext constructor is a precision, so your second output gets 50 significant digits, plus the one additional zero between the decimal point and the 2.
  First decimal place (start counting scale from here)
  ↓
0.02127659574468085106382978723
   ↑
   First significant digit (start counting precision from here)
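To make the distinction concrete, here is a small sketch (the class name ScaleVsPrecision is mine) that divides 1 by 47 both ways with 5 digits and prints each result alongside its scale() and precision():

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

public class ScaleVsPrecision {
    public static void main(String[] args) {
        BigDecimal one = BigDecimal.ONE;
        BigDecimal fortySeven = new BigDecimal(47);

        // divide(divisor, scale, roundingMode): the result has exactly 5 decimal places
        BigDecimal byScale = one.divide(fortySeven, 5, RoundingMode.HALF_UP);

        // divide(divisor, mathContext): the result has exactly 5 significant digits
        BigDecimal byPrecision = one.divide(fortySeven, new MathContext(5, RoundingMode.HALF_UP));

        System.out.println(byScale + " scale=" + byScale.scale()
                + " precision=" + byScale.precision());
        // prints: 0.02128 scale=5 precision=4

        System.out.println(byPrecision + " scale=" + byPrecision.scale()
                + " precision=" + byPrecision.precision());
        // prints: 0.021277 scale=6 precision=5
    }
}
```

The scale-based overload stops at the fifth decimal place (0.02128), while the MathContext version keeps going until it has five significant digits (0.021277), pushing the scale to 6 because of the leading zero after the decimal point.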