Which is the best way to compare a BigDecimal and an int in Java: converting the BigDecimal to an int, or converting the int to a BigDecimal?
If you expect the BigDecimal value to be really big (i.e. outside the range of int values, which is -2^31 to 2^31-1) and/or to contain decimal digits, or if you simply want to play it safe, you should convert the int to BigDecimal to avoid overflow / truncation errors.
Otherwise, if performance is a really big issue (which is rare), converting the BigDecimal to an int might be the better option.
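A minimal sketch of the first approach, using compareTo (the variable names and values are just for illustration):

```java
import java.math.BigDecimal;

public class CompareDemo {
    public static void main(String[] args) {
        BigDecimal bd = new BigDecimal("2147483648.5"); // outside int range and has a fraction
        int i = 42;

        // Convert the int to BigDecimal and compare.
        // BigDecimal.valueOf is exact for any int, so nothing overflows or gets truncated.
        // compareTo returns a negative value, zero, or a positive value.
        int cmp = bd.compareTo(BigDecimal.valueOf(i));

        if (cmp > 0) {
            System.out.println("bd is greater than i");
        } else if (cmp < 0) {
            System.out.println("bd is less than i");
        } else {
            System.out.println("bd is equal to i");
        }
    }
}
```

Note that compareTo is preferable to equals for this kind of check, because equals also takes the scale into account (e.g. 2.0 and 2.00 are not equal, but compare as equal).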