Tags: javascript, decimal, scale, axis-labels

How do I determine how many decimal places to shift right for a given variable input?


I am auto-scaling my data across multiple y-axes based on each axis's yMin value.

I have input data sets for multiple y-axes, as below:

yAxis1: data set 1, yMin is 0.118234 --> hence my minScale for yAxis1 will be 0.1

yAxis2: data set 2, yMin is 0.011823 --> hence my minScale for yAxis2 will be 0.01

yAxis3: data set 3, yMin is 0.001182 --> hence my minScale for yAxis3 will be 0.001

Obviously, minScale depends on how many places the decimal point must shift right, and is then computed with Math.pow(10, -1 * decimal);

But how can I determine that number of decimal places? Any help?


Solution

  • var decimal = -Math.floor(Math.log(yMin) / Math.log(10));