I am under the impression that INT_MAX will turn on all 32 bits of an int. If I negate that and 'and' it with the original value, I should be comparing all 0s with all 1s and get back false. What am I missing?
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int x = INT_MAX;
    x = ~x && INT_MAX;
    printf("x = %d\n", x); /* prints 1 */
    x = 0;
    x = ~x && INT_MAX;
    printf("x = %d\n", x); /* prints 1 */
}
Edit: Oh wow, I was flipping the sign bit as well. Using UINT_MAX is giving me the result I needed. Thank you, everyone!
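For anyone curious, a minimal sketch of the unsigned version that now gives 0 (assuming the same && expression, just switched to unsigned int and UINT_MAX):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int x = UINT_MAX;  /* all 32 bits set */
    x = ~x && UINT_MAX;         /* ~UINT_MAX == 0, and 0 && anything is 0 */
    printf("x = %u\n", x);      /* prints 0 */
    return 0;
}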
You're working with signed ints here: ~INT_MAX == INT_MIN (edit: on two's complement, which is what every modern processor uses), not 0. In C, every value except 0, including negative values, evaluates to true in a boolean context, so your expression becomes INT_MIN && INT_MAX, which is 1. If you switch to unsigned types, everything works as expected, since ~UINT_MAX == 0.
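A small sketch demonstrating both points (assuming two's complement, as noted above):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("%d\n", ~INT_MAX == INT_MIN); /* 1: flipping every bit of INT_MAX yields INT_MIN */
    printf("%d\n", INT_MIN && INT_MAX);  /* 1: both operands are nonzero, so && is true */
    printf("%d\n", ~UINT_MAX == 0u);     /* 1: with unsigned, the complement really is 0 */
    return 0;
}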