I am currently learning Java and have stumbled upon `BigDecimal`. From what I have seen, `BigDecimal` is more precise, and it is therefore recommended over the normal `double`.

My question is: should I always use `BigDecimal` instead of `double` because it is simply better, or does it have disadvantages? And is it worth it for a beginner to switch to `BigDecimal`, or is it only recommended once you have more experience with Java?
`double` should be used whenever you are working with real numbers where perfect precision is not required. Here are some common examples:

- Graphics, which is built on `float` and `double`, and trigonometry is essential to most graphics work.
- Random numbers (`Random.nextDouble()`), where the point isn't a specific number of digits; the priority is a real number over some specific distribution.

For values like money, using `double` at any point at all is a recipe for a bad time.
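To see why money and `double` don't mix, here is a minimal sketch of the classic binary rounding problem: neither 0.1 nor 0.2 has an exact base-2 representation, so their sum drifts away from 0.3.

```java
public class DoubleDrift {
    public static void main(String[] args) {
        // 0.1 and 0.2 are stored as the nearest representable binary
        // fractions, so the sum is not exactly 0.3.
        double sum = 0.1 + 0.2;
        System.out.println(sum);         // prints 0.30000000000000004
        System.out.println(sum == 0.3);  // prints false
    }
}
```

One cent of drift per operation is invisible in graphics but unacceptable in accounting, which is why the advice above is so categorical.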
`BigDecimal` should generally be used for money and anything else where you care about a specific number of decimal digits, but it has inferior performance and offers fewer mathematical operations.
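As a sketch of typical money handling (the price and tax rate here are made-up values): construct `BigDecimal` from a `String`, not from a `double`, since `new BigDecimal(0.1)` would capture the already-inexact binary value and defeat the purpose.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class BigDecimalMoney {
    public static void main(String[] args) {
        // String constructor keeps the decimal value exact.
        BigDecimal price = new BigDecimal("19.99");
        BigDecimal rate  = new BigDecimal("0.0825"); // hypothetical tax rate

        // Multiply exactly, then round to 2 decimal places (cents).
        BigDecimal tax = price.multiply(rate)
                              .setScale(2, RoundingMode.HALF_UP);

        System.out.println(tax);            // prints 1.65
        System.out.println(price.add(tax)); // prints 21.64
    }
}
```

Note one common pitfall: `BigDecimal.equals` compares scale as well as value (`new BigDecimal("1.0")` is not `equals` to `new BigDecimal("1.00")`), so use `compareTo` for numeric comparison.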