
How to represent a negative hexadecimal number?


If I have a negative decimal number, say -5 for example, and convert it to hexadecimal format, would I be able to simply put a negative sign in front of the hexadecimal number? Or is there another way of doing it, like with 2's complement in binary?


Solution

  • There is no single way to represent a negative number. That said, there are several standard ways. To keep the math simple, I'm going to assume that all numbers use 4 bits.

    1. Use a sign bit. In binary, 0111 is 7, but 1111 is -7. (This can also be done in reverse, so that 0111 is -7 and 1111 is 7.)
    2. Use the 1's complement. 0111 is 7, but 1000 is -7 (all bits flipped). This has the odd property that 0000 is a natural 0, but 1111 is a negative zero (-0).
    3. Use the 2's complement. Negation is the 1's complement plus one, so 0111 is 7, but 1000 + 0001, or 1001, is -7. This leverages integer overflow to avoid a negative zero: 0000 negated is 1111 + 0001, which overflows back to 0000. It also has the nice property that adding a number to its negative resolves to zero, provided both numbers can be written (there is one more negative number than there are positives). 7 + (-7) is 0111 + 1001, which overflows to 0000. (The sketch just after this list shows all three schemes in code.)
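
    As a concrete illustration of the three schemes above, here is a minimal Python sketch (the 4-bit width and helper names are just for demonstration, not standard functions):

    BITS = 4
    MASK = (1 << BITS) - 1              # 0b1111

    def sign_magnitude(n):
        # Sign bit in the top position, magnitude in the remaining bits.
        return (0b1000 | abs(n)) if n < 0 else n

    def ones_complement(n):
        # Negative numbers are stored with every bit flipped.
        return (~abs(n)) & MASK if n < 0 else n

    def twos_complement(n):
        # One's complement plus one; masking a negative Python int does this.
        return n & MASK

    for f in (sign_magnitude, ones_complement, twos_complement):
        print(f.__name__, format(f(-7), "04b"), hex(f(-7)))
    # sign_magnitude  1111 0xf
    # ones_complement 1000 0x8
    # twos_complement 1001 0x9

    In the same spirit, the -5 from the question comes out as 0xFB when stored as an 8-bit 2's complement value ((-5) & 0xFF), which is why a stored negative number usually shows up in hexadecimal as 0xFB rather than as -0x5.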

    You may hear the saying that the "bits mean whatever you want them to mean". This means that you can come up with any number of ways to represent anything, you just build a "map" of the bits to the values you desire. For example, here is an odd, whimsical way of representing prime numbers.

    (bits) => value
    0001 => 2
    0010 => 3
    0011 => 5
    0100 => 7
    0101 => 11
    0110 => 13
    0111 => 17
    (and so on)
    

    Such a system would be hard to do math with, but it is an example showing that you don't have to be constrained to one specific way of doing anything. As long as you build routines to produce the expected output from the expected input, you can make the mapping of bits to values mean whatever you want it to mean.
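
    For example, a pair of routines for that whimsical prime mapping might look like this (a sketch only; the table and function names are invented for illustration):

    PRIMES = [2, 3, 5, 7, 11, 13, 17]     # values for patterns 0001, 0010, ...

    def decode(bits):
        # Map a 4-bit pattern (as an int) to its prime value.
        return PRIMES[bits - 1]

    def encode(value):
        # Map a prime value back to its 4-bit pattern.
        return PRIMES.index(value) + 1

    print(format(encode(11), "04b"))      # 0101
    print(decode(0b0101))                 # 11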

    This idea that meaning is something you impose upon the bits is important. When you start to deal with text, the "encoding" is the meaning imposed upon the bits storing that text, and the same bits sometimes encode different letters under different encoding schemes.
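
    One quick way to see this is to decode the same byte under two different encodings (a Python sketch; the byte value and encodings are chosen only to demonstrate the effect):

    raw = bytes([0xE9])

    print(raw.decode("latin-1"))   # é  (Latin-1 maps 0xE9 to U+00E9)
    print(raw.decode("cp1251"))    # й  (the Cyrillic code page maps it elsewhere)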