Why does JavaScript incorrectly evaluate the following?
0xAABBCCDD & 0xFF00FF00
In JavaScript:
console.log((0xAABBCCDD & 0xFF00FF00).toString(16)) // -55ff3400
console.log((0xAABBCCDD & 0xFF00FF00) === 0xAA00CC00) // false
In C++:
cout << hex << (0xAABBCCDD & 0xFF00FF00) << endl; // aa00cc00
As Pointy pointed out in his answer, JavaScript's bitwise operators work on signed 32-bit integers, so any result with bit 31 set is interpreted as a negative number. You can apply >>> 0 (an unsigned right shift by zero bits) to reinterpret the result as an unsigned 32-bit value.
console.log(((0xAABBCCDD & 0xFF00FF00) >>> 0).toString(16)) // Prints aa00cc00
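For completeness, a short sketch (using only the values already in the question) of where the negative number comes from, and of the comparison now succeeding:

// 0xAA00CC00 is 2852178944, which is larger than 2^31 - 1, so forcing it
// through a signed 32-bit conversion (here with | 0) wraps it to a negative value:
console.log((0xAA00CC00 | 0).toString(16))                      // -55ff3400
// With >>> 0 applied, the equality check from the question evaluates as expected:
console.log(((0xAABBCCDD & 0xFF00FF00) >>> 0) === 0xAA00CC00)   // true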