I want to convert a hex string to a byte array. I thought using BigInteger would be a good idea, but for values greater than 0x7F it produces unexpected results.
My code:
var bytes = new BigInteger("80", 16).toByteArray();
for (var b : bytes) System.out.println(b);
It outputs:
0
-128
Why does this produce two bytes?
I would have expected 00 to FF to produce one byte, 0100 to FFFF to produce two bytes, and so forth.
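For anyone who wants to see exactly where that expectation breaks down, here is a minimal sketch (the class name is mine) printing the array lengths BigInteger actually produces:

import java.math.BigInteger;

public class ToByteArrayDemo {
    public static void main(String[] args) {
        // One byte is enough while the top bit is clear...
        System.out.println(new BigInteger("7F", 16).toByteArray().length);   // 1
        // ...but from 0x80 on, a leading 0x00 keeps the sign bit at 0.
        System.out.println(new BigInteger("80", 16).toByteArray().length);   // 2
        System.out.println(new BigInteger("FF", 16).toByteArray().length);   // 2
        System.out.println(new BigInteger("0100", 16).toByteArray().length); // 2
        System.out.println(new BigInteger("FFFF", 16).toByteArray().length); // 3
    }
}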
Side note: the first byte actually seems to matter:
new BigInteger(new byte[]{ (byte) 0x80 });       // produces -128 (negative!)
new BigInteger(new byte[]{ 0, (byte) 0x80 });    // produces 128
new BigInteger(new byte[]{ 0, 0, (byte) 0x80 }); // produces 128
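For completeness: the sign-magnitude constructor BigInteger(int signum, byte[] magnitude) treats the array as an unsigned magnitude, so the padding byte is never needed. A minimal sketch (class name is mine):

import java.math.BigInteger;

public class SignumDemo {
    public static void main(String[] args) {
        // signum = 1 means "positive"; the bytes are pure magnitude.
        BigInteger positive = new BigInteger(1, new byte[]{ (byte) 0x80 });
        System.out.println(positive); // 128, no leading zero required
    }
}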
Thanks for all your contributions, comments, and answers; I finally understand what is going on.
Numbers from 0 to hex 0x7F (decimal 127) work like this:
First bit = sign (0 = +, 1 = -)
| Other bits = number
v vvv vvvv
0 111 1111 = 0x7F = 127
Numbers from 128 to hex 0x7FFF (decimal 32,767) work like this:
First bit (of the first byte!) = sign
| Other bits and other bytes = number
v vvv vvvv vvvv vvvv
0 000 0000 1000 0000 = 0x80 = 128
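A quick sketch (class name is mine) that shows the sign bit in action on a single byte:

public class SignBitDemo {
    public static void main(String[] args) {
        byte b = (byte) 0x80;         // bit pattern 1000 0000
        System.out.println(b);        // -128: the top bit is set, so Java reads the byte as negative
        System.out.println(b & 0xFF); // 128: masking widens to int and drops the sign interpretation
    }
}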
Long story short: toByteArray() always returns the two's-complement representation, so any value whose top bit would otherwise be 1 gets an extra leading 0x00 byte to keep the sign positive.
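If the original goal remains a plain hex-to-bytes conversion, java.util.HexFormat (Java 17+, assuming that version is available) has no sign semantics at all; a minimal sketch (class name is mine):

import java.util.HexFormat;

public class HexToBytes {
    public static void main(String[] args) {
        // parseHex converts digit pairs directly: "80" becomes exactly one byte.
        byte[] bytes = HexFormat.of().parseHex("80");
        for (byte b : bytes) System.out.println(b); // -128 (the single byte 0x80)
    }
}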