I have this Rust code:
pub const fn decompose(result: i64) -> (bool, u64, u64) {
    let ptr = (result >> 32) as u64;
    let len = ((result << 32) >> 48) as u64;
    let success = ((result << 63) >> 63) == 0;
    (success, ptr, len)
}
It's just simple bit shifts, assuming result is never negative.
This is my implementation in TypeScript:
function decompose(result: bigint): CallResult {
    let ptr = result >> 32n;
    let len = (result << 32n) >> 48n;
    let success = (result << 63n) >> 63n == 0n;
    return {
        ptr: ptr,
        len: len,
        status: success,
    };
}
For the input 844562369609728, they give different results, with the TypeScript version producing the wrong one. Why is that?
rust playground: https://play.rust-lang.org/?version=stable&mode=debug&edition=2021&gist=e1b421f867f9cb6e9b6af03d6ca835bc
typescript playground: https://www.mycompiler.io/view/48cssguAXa6
According to https://www.assemblyscript.org/types.html, an i64 is a bigint in JS, so this should work, right?
Why is that? Because BigInt doesn't truncate to 64 bits after shifting, while i64 does. If you truncate manually (e.g. by bit-ANDing with ((1n << 64n) - 1n)), you'll get the same result for your example input:
let ptr = result >> 32n;
let len = ((result << 32n) & ((1n << 64n) - 1n)) >> 48n;
let success = ((result << 63n) & ((1n << 64n) - 1n)) >> 63n == 0n;
That being said, I very much agree with @Brian61354270 that shifting around signed integers is playing with fire, and I have no idea whether this will work for all inputs.
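As an alternative to masking by hand, JavaScript's built-in BigInt.asIntN and BigInt.asUintN wrap a BigInt to a signed or unsigned n-bit value, which models Rust's i64 arithmetic shifts and the final as u64 casts more directly. A sketch, with the CallResult shape assumed to be the one from the question:

```typescript
// Assumed shape from the question's code.
interface CallResult {
    ptr: bigint;
    len: bigint;
    status: boolean;
}

function decompose(result: bigint): CallResult {
    // BigInt.asIntN(64, x) truncates to 64 bits and sign-extends,
    // matching Rust's i64 arithmetic shift; BigInt.asUintN(64, x)
    // models the final `as u64` cast.
    const ptr = BigInt.asUintN(64, BigInt.asIntN(64, result) >> 32n);
    const len = BigInt.asUintN(64, BigInt.asIntN(64, result << 32n) >> 48n);
    const status = BigInt.asIntN(64, result << 63n) >> 63n === 0n;
    return { ptr, len, status };
}
```

For the example input this matches the Rust output: decompose(844562369609728n) yields ptr 196640n, len 8n, status true. The same caveat applies, though: if result can ever be negative, the sign-extension from the arithmetic shifts leaks into ptr and len, so the nonnegativity assumption still matters.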