
How to correctly sum up numbers in JavaScript and avoid decimal errors


What is wrong with these JavaScript numbers?

I'm trying to sum up earnings, but JS math calculations show some strange behaviour.

let earned = 0

setInterval(() => {
  earned += 0.1
  console.log(earned) // after three ticks: 0.30000000000000004 instead of 0.3
}, 1000)

The only solution I came up with for this problem is to use whole numbers instead of decimals, then divide them by 100 and use .toFixed(2) to add the decimals back. But I'm not quite sure if it's the best way to handle this type of issue.

Is there an existing best practice in the JS world, or am I doing something wrong?

let earned = 0

setInterval(() => {
  earned += 10

  let output = earned / 100
  output = output.toFixed(2)
  output = Number(output)

  console.log(typeof output, output)
}, 1000)

And how would I deal with 8 decimals, such as 14.12345678?


Solution

  • This issue is caused by the way floating-point numbers are represented in JavaScript. Not all decimal numbers can be represented exactly in binary format, which is used by computers to perform calculations. This can lead to small rounding errors that accumulate over time.

    I think what you are doing is okay. Summing integers and only converting to a decimal representation for display sidesteps the rounding problem entirely, which is a common practice for money values.
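The drift is easy to reproduce without any timers; summing 0.1 three times already misses 0.3, and `Number.EPSILON` can serve as a tolerance when you only need an approximate comparison:

```javascript
// 0.1 has no exact binary representation, so repeated addition drifts
let earned = 0
for (let i = 0; i < 3; i++) {
  earned += 0.1
}

console.log(earned)         // 0.30000000000000004, not 0.3
console.log(earned === 0.3) // false

// Number.EPSILON gives a tolerance for "close enough" comparisons
console.log(Math.abs(earned - 0.3) < Number.EPSILON) // true
```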
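For the 8-decimal case, the same integer trick works with a larger scale factor. Here is a minimal sketch (the scale of 10^8 and the helper names `toUnits`/`toDecimalString` are my own choices, not a standard API), using `BigInt` so large totals stay exact; it assumes non-negative amounts in plain `"digits.digits"` form:

```javascript
// Keep amounts as integers in the smallest unit you care about.
// For 8 decimal places the scale factor is 10 ** 8.
const SCALE = 10 ** 8

// Convert a decimal string to integer units without a float round-trip
function toUnits(decimalStr) {
  const [whole, frac = ''] = decimalStr.split('.')
  const padded = frac.padEnd(8, '0').slice(0, 8) // normalize to 8 digits
  return BigInt(whole) * BigInt(SCALE) + BigInt(padded)
}

// Format integer units back to a "whole.frac" string for display
function toDecimalString(units) {
  const whole = units / BigInt(SCALE)
  const frac = (units % BigInt(SCALE)).toString().padStart(8, '0')
  return `${whole}.${frac}`
}

let earned = toUnits('14.12345678')
earned += toUnits('0.1') // exact integer addition, no drift
console.log(toDecimalString(earned)) // 14.22345678
```

Parsing from a string rather than multiplying a float by the scale matters: `14.12345678 * 1e8` would itself go through floating point and could land off by one unit.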