I was expecting the following comparison to give an error:
var A = B = 0;
if(A == B == 0)
console.log(true);
else
console.log(false);
but strangely it returns false.
Even more strangely,
console.log((A == B == 1));
returns true.
How does this "ternary" kind of comparison work?
First, we need to understand that a == comparison between a number and a boolean value results in internal type conversion of the boolean value to a number (true becomes 1 and false becomes 0).
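As a quick illustration of that coercion rule (these lines can be run in any JavaScript console):

```javascript
// Abstract equality (==) coerces a boolean operand to a number
// before comparing it with a number operand.
console.log(true == 1);   // true  (true becomes 1)
console.log(false == 0);  // true  (false becomes 0)
console.log(true == 0);   // false (true becomes 1, and 1 != 0)
```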
The expression you have shown is evaluated from left to right. So, first A == B is evaluated; the result is true, and you are then comparing true with 0. Since true becomes 1 during comparison, 1 == 0 evaluates to false. But when you say
console.log((A == B == 1));
A == B is true, which, when compared with a number, becomes 1, and you are comparing that with 1 again. That is why it prints true.