I am writing some type-check functions. While writing isInteger, I noticed that isInteger(1.0) returns true, but I want it to return false.
My function is like this:
function isInteger(value) {
    return typeof value === 'number' && value % 1 === 0;
}
I also noticed that it returns true for 1.0 because the argument value is immediately converted to 1 (which I don't want).
I did this:
function isInteger(value) {
    console.log(value); // prints 1 even when the call site passes 1.0
    return typeof value === 'number' && value % 1 === 0;
}
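For reference, here is how the function above behaves for a few inputs; the call with 1.0 is the surprising one:

```javascript
function isInteger(value) {
    return typeof value === 'number' && value % 1 === 0;
}

console.log(isInteger(1));    // true
console.log(isInteger(1.0));  // true – the literal 1.0 is parsed as 1
console.log(isInteger(1.5));  // false
console.log(isInteger("1"));  // false – not typeof 'number'
```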
Of course, I considered the situation where I might need to use a regex:
var regEx = /^-?[0-9]+$/;
regEx.test(value);
But first, I need the function to actually receive the value 1.0 instead of 1. How can I preserve the 1.0 in the function argument?
Thank you
If you type

var a = 1;
var b = 1.0;

then a and b are stored exactly the same way; there is no difference. That's because JavaScript has only one storage type for numbers: IEEE 754 double-precision floating point.
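A quick check in any JavaScript engine confirms this: the ".0" exists only in the source text and is gone after parsing:

```javascript
var a = 1;
var b = 1.0;

console.log(a === b);               // true – same IEEE 754 double
console.log((1.0).toString());      // "1" – the trailing ".0" is not preserved
console.log(Number.isInteger(1.0)); // true – same value as 1
```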
If you want to keep them distinct, don't store them as numbers; keep them as strings, for example.
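A minimal sketch of the string approach, reusing the regex from the question. The helper names isIntegerString and isDecimalString are made up for illustration:

```javascript
// Keep the raw input as a string and classify it with regexes,
// so "1" and "1.0" remain distinguishable.
function isIntegerString(s) {
    return /^-?[0-9]+$/.test(s);         // "1" -> true, "1.0" -> false
}

function isDecimalString(s) {
    return /^-?[0-9]+\.[0-9]+$/.test(s); // "1.0" -> true, "1" -> false
}

console.log(isIntegerString("1"));   // true
console.log(isIntegerString("1.0")); // false
console.log(isDecimalString("1.0")); // true
```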