
Why do both if('0'==false) and if('0') evaluate to true in JavaScript?


From what I know, the if statement in JavaScript coerces its condition to a Boolean and then acts on the result, like the following:

if(true) {
    // run this
}

if(false) {
    // do not run this
}

And that works. But if I do this:

if('0' == false) {
    // We get here, so '0' is a falsy value
}

Then I would expect this:

if('0') {
    // We don't get here, because '0' is a falsy value
}

But instead, I get:

if('0') {
    // We *DO* get here, even though '0' is a falsy value
}

What's happening? Apparently, if does not simply check whether its condition is truthy or falsy, but performs some other conversion?


Solution

  • This is just one of those "gotchas" with the == rules, which are rather complex.

    The comparison x == y, where x and y are values, produces true or false. Such a comparison is performed as follows:

    (4) If Type(x) is Number and Type(y) is String, return the result of the comparison x == ToNumber(y).

    (5) If Type(x) is String and Type(y) is Number, return the result of the comparison ToNumber(x) == y.

    (6) If Type(x) is Boolean, return the result of the comparison ToNumber(x) == y.

    (7) If Type(y) is Boolean, return the result of the comparison x == ToNumber(y).

    In this case, that means that '0' == false is first coerced to '0' == 0 (by rule #7) and then, on the second pass through, it is coerced to 0 == 0 (by rule #5) which results in true.
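    The two passes can be sketched directly in code. Number() mirrors the spec's ToNumber conversion, so each intermediate comparison can be checked by hand:

    ```javascript
    // Pass 1 (rule #7): the Boolean operand is converted to a number,
    // so '0' == false becomes '0' == 0.
    console.log(Number(false)); // 0

    // Pass 2 (rule #5): the String operand is converted to a number,
    // so '0' == 0 becomes 0 == 0.
    console.log(Number('0'));   // 0

    console.log('0' == false);  // true - the original comparison
    console.log('0' == 0);      // true - the intermediate step
    ```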

    This particular case is somewhat tricky because false ~> 0 happens instead of '0' ~> true (as one might expect). However, '0' is itself a truthy value, and the behavior follows from the rules above. To compare the truthiness of the two values directly (which is different from strict equality), without the implicit numeric conversions, consider:

    !!'0' == !!false
    

    (For all values: !falsy -> true and !truthy -> false.)
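
    The same check can be written with Boolean(), which applies the identical ToBoolean conversion that if uses on its condition. This makes the apparent contradiction in the question explicit:

    ```javascript
    // Boolean() is the conversion `if` applies to its condition.
    console.log(Boolean('0'));     // true  - a non-empty string is truthy
    console.log(Boolean(false));   // false

    // Comparing truthiness: the two values disagree.
    console.log(!!'0' == !!false); // false

    // Comparing with ==: numeric coercion kicks in instead.
    console.log('0' == false);     // true
    ```

    So if ('0') runs its body because '0' is truthy, while '0' == false is true only because both sides are coerced to the number 0.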