I'm having an issue trying to XOR a value with all 32 bits set (0xffff ffff). For example, 0xffffffff ^ 0xfff should be 0xfffff000, but in JS I can't declare an unsigned int, so I get -4096 (or -1000 in hex).
Here is the JS console output:
>0xffffffff^0xfff
<-4096
>(0xffffffff^0xfff).toString(16)
<"-1000"
>0xffffffff.toString(2)
<"11111111111111111111111111111111"
>0xfff.toString(2)
<"111111111111"
It formats correctly until I do something with it. Is there a way to handle this case correctly?
UPD: I realized that the problem is the leading (sign) bit, but I still don't know how to deal with it.
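For illustration, a couple of console lines (same environment assumed) showing that the XOR already produces the right 32-bit pattern; only the signed interpretation makes it print as a negative number:
>(0xffffffff^0xfff) === -0x1000
<true
>(0xfffff000 | 0) === -4096
<true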
The unsigned right shift operator (>>>) is helpful in this case: shifting by zero bits reinterprets the signed 32-bit result as an unsigned value.
>((0xf0000000^0xf)>>>0).toString(16)
<"f000000f"