I've got the following expression in Android Java:
myVar + " == 0 ? " + BigDecimal.ZERO.equals(myVar)
which outputs the following:
0.0000 == 0 ? false
Where myVar is declared as:
public BigDecimal myVar;
and is assigned successfully by Gson from served JSON data (I inspected the payload to verify the JSON was good).
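For reference, here is a minimal, self-contained reproduction. The new BigDecimal("0.0000") line stands in for the value Gson deserializes; my assumption is that Gson builds the BigDecimal from the JSON text "0.0000" and therefore keeps its scale of 4.

import java.math.BigDecimal;

public class ZeroCompare {
    // Stand-in for the Gson-populated field; built from the JSON text,
    // so "0.0000" keeps a scale of 4.
    public static BigDecimal myVar = new BigDecimal("0.0000");

    public static void main(String[] args) {
        // Prints: 0.0000 == 0 ? false
        System.out.println(myVar + " == 0 ? " + BigDecimal.ZERO.equals(myVar));
    }
}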
Why does 0.0000 as a BigDecimal not equal BigDecimal.ZERO?
From the documentation:
Returns true if x is a BigDecimal instance and if this instance is equal to this big decimal. Two big decimals are equal if their unscaled value and their scale is equal. For example, 1.0 (10 × 10^-1) is not equal to 1.00 (100 × 10^-2). Similarly, zero instances are not equal if their scale differs.
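The quoted rule is easy to verify: both of your values have the same unscaled value (0) but different scales, which is exactly why equals returns false. A small sketch (the variable name is mine):

import java.math.BigDecimal;

public class ScaleDemo {
    public static void main(String[] args) {
        BigDecimal fromJson = new BigDecimal("0.0000");

        System.out.println(fromJson.unscaledValue());        // 0
        System.out.println(fromJson.scale());                // 4
        System.out.println(BigDecimal.ZERO.unscaledValue()); // 0
        System.out.println(BigDecimal.ZERO.scale());         // 0

        // equals() requires both the unscaled value and the scale to match,
        // so 0.0000 (scale 4) is not equal to 0 (scale 0).
        System.out.println(BigDecimal.ZERO.equals(fromJson)); // false
    }
}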
Why did they decide this? Probably because it's easier to implement. Is it a good decision? I don't think so, but it is what it is; use another library if you don't like it.
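That said, if all you need is a numeric zero check without reaching for another library, compareTo ignores scale, and signum is even more direct. A minimal sketch, assuming the field is populated as in the question:

import java.math.BigDecimal;

public class ZeroCheck {
    public static void main(String[] args) {
        BigDecimal myVar = new BigDecimal("0.0000");

        // compareTo() compares numeric value only and ignores scale.
        System.out.println(BigDecimal.ZERO.compareTo(myVar) == 0); // true

        // signum() returns -1, 0, or 1, so this is an even shorter zero test.
        System.out.println(myVar.signum() == 0); // true
    }
}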