I'm using Kryo to deserialize a class that was originally serialized in Spark. Kryo writes all of its primitives in big-endian format, but when I try to deserialize the values on another machine, the value comes back as if it were little-endian.
Underlying method in Kryo:
public int readInt () throws KryoException {
require(4); // Does a basic positionality check that passes in this case
byte[] buffer = this.buffer;
int p = this.position;
this.position = p + 4;
return buffer[p] & 0xFF //
| (buffer[p + 1] & 0xFF) << 8 //
| (buffer[p + 2] & 0xFF) << 16 //
| (buffer[p + 3] & 0xFF) << 24;
}
This returns the value 0x70000000, so the four bytes in the buffer must be 0x00 0x00 0x00 0x70. But when my program (in Scala) uses Kryo's readByte method:
public byte readByte () throws KryoException {
if (position == limit) require(1);
return buffer[position++];
}
and reads the bytes individually, like this:
val a = input.readByte()
val b = input.readByte()
val c = input.readByte()
val d = input.readByte()
val x = (a & 0xFF) << 24 | (b & 0xFF) << 16 | (c & 0xFF) << 8 | d & 0xFF
Then I get 0x70 for x. I don't understand what's happening here. Is it some kind of conversion issue between Scala and Java, or something to do with Kryo and the underlying byte array?
The code you wrote:
val a = input.readByte()
val b = input.readByte()
val c = input.readByte()
val d = input.readByte()
val x = (a & 0xFF) << 24 | (b & 0xFF) << 16 | (c & 0xFF) << 8 | d & 0xFF
is converting the bytes to an int in the wrong order. If you look closely at the readInt() method you pasted, you'll see that it composes the bytes least-significant-first, i.e. little-endian: buffer[p] stays in the low byte and buffer[p + 3] is shifted into the high byte. Your snippet does the opposite, shifting the first byte (a) into the high position and leaving the last byte (d) in the low position, which is why you get 0x70 instead of 0x70000000.
val x = (a & 0xFF) | (b & 0xFF) << 8 | (c & 0xFF) << 16 | d & 0xFF << 24;
would be the correct way to write this.
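To see the two compositions side by side, here is a minimal self-contained Scala sketch. The buffer contents are an assumption for illustration (the four bytes that produce 0x70000000 under the little-endian readInt() shown above), using a plain Array[Byte] in place of a real Kryo Input:

// Assumed buffer contents: the bytes Kryo wrote, least significant first.
val bytes = Array[Byte](0x00, 0x00, 0x00, 0x70)
val a = bytes(0)
val b = bytes(1)
val c = bytes(2)
val d = bytes(3)

// Little-endian composition, matching Kryo's readInt():
val le = (a & 0xFF) | (b & 0xFF) << 8 | (c & 0xFF) << 16 | (d & 0xFF) << 24
println(f"0x$le%08X") // prints 0x70000000

// Big-endian composition, as in the original snippet:
val be = (a & 0xFF) << 24 | (b & 0xFF) << 16 | (c & 0xFF) << 8 | (d & 0xFF)
println(f"0x$be%08X") // prints 0x00000070

Equivalently, java.nio.ByteBuffer.wrap(bytes).order(java.nio.ByteOrder.LITTLE_ENDIAN).getInt() yields the same 0x70000000, which makes the byte order explicit instead of encoding it in the shift amounts.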