I have two conditions that check whether a equals zero and b is less than 5. Both behave the same way, but use different operators. Is there any real difference between them?
var a = 0;
var b = 3;
if (!a & (b < 5)) {
}
if (!a && b < 5) {
}
The key difference is that the & operator is a bitwise operator, while the && operator is a logical operator.
Bitwise operators work on bits and perform "bit by bit" operations on their operands. The & operator performs a bitwise AND: for each pair of corresponding bits in the two inputs, the result bit is 1 only if both input bits are 1.
So for instance, in the example you provided, !a is true (since a is 0) and b < 5 is true (since b is 3), so you effectively have:
if (true & true) {
}
In JavaScript, & coerces both operands to 32-bit integers (true becomes 1), so this is 1 & 1, which evaluates to 1 (truthy), and the body of the if runs. Had either operand been 0, the result would have been 0 (false), because both input bits must be 1 for the output bit to be 1.
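To see the bit-by-bit behaviour on values other than 0 and 1, here is a small sketch (the numbers are arbitrary, chosen only for illustration):
var x = 6;      // binary 110
var y = 3;      // binary 011
var z = x & y;  // binary 010: each result bit is 1 only where both inputs have a 1 bit
console.log(z); // 2
Note that 6 & 3 is 2, which is truthy, so inside an if it would count as true even though neither input was the boolean true.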
The && operator is a logical operator, used to make a decision based on multiple conditions. It takes two operands, each of which is evaluated as true or false.
if (!a && b < 5) {
}
The above evaluates to true, because both conditions are met (!a is true and b < 5 is true), so the code inside the if statement is executed. In addition, the && operator will (depending on the language) short-circuit, which means the second condition is only evaluated if the first condition does not already determine the outcome.
So if a were nonzero, !a would be false and evaluation would stop right there: there is no point in checking b < 5, because false && anything is false. This is another difference, as the bitwise & operator does not short-circuit; it always evaluates both operands.
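A quick way to observe the short-circuit difference is to put a side effect in the second operand. This is just an illustrative sketch; the check function is made up for the demonstration:
function check() {
    console.log("second operand evaluated");
    return true;
}
var a = 1;             // !a is false, so the overall result is already decided
if (!a && check()) {}  // prints nothing: && never calls check()
if (!a & check()) {}   // prints the message: & evaluates both operands anyway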