Tags: javascript, go, xtea

Converting customized XTEA algorithm from JavaScript to Golang


I have converted a customized XTEA encryption routine from JavaScript to Golang, but the Golang output is incorrect and not the same as the JavaScript output. Here's my JavaScript source code:

function sample(e, t) {
    for (var n = 32, r = 0; 0 < n--; ) {
        e[0] += (((e[1] << 4) ^ (e[1] >> 5)) + e[1]) ^ (r + t[3 & r]);
        r += -1640531527;
        e[1] += (((e[0] << 4) ^ (e[0] >> 5)) + e[0]) ^ (r + t[(r >> 11) & 3]);
    }
}
var temp = [15, 16];
var temp_2 = [14, 15, 16, 17];
sample(temp, temp_2);
console.log(temp);

and here is my Golang source code:

func sample(v *[2]uint32, key *[4]uint32) {
    const (
        num_rounds uint32 = 32
        delta      uint32 = 0x9E3779B9
    )
    for i, sum := uint32(0), uint32(0); i < num_rounds; i++ {
        v[0] += (((v[1] << 4) ^ (v[1] >> 5)) + v[1]) ^ (sum + key[sum&3])
        sum += delta
        v[1] += (((v[0] << 4) ^ (v[0] >> 5)) + v[0]) ^ (sum + key[(sum>>11)&3])
    }
}

I think the problem is related to the golden-ratio delta and to the conversion from JavaScript's 64-bit float number system, which I haven't applied because I didn't know exactly how to do that.


Solution

  • Here is the Go implementation:

    package main
    
    import (
        "fmt"
    )
    
    func main() {
        v := [2]int64{15, 16}
        key := [4]int64{14, 15, 16, 17}
    
        sample(&v, &key)
    }
    
    func sample(v *[2]int64, key *[4]int64) {
        const (
            num_rounds       = 32
            delta      int64 = 1640531527
        )
        for i, sum := 0, int64(0); i < num_rounds; i++ {
            // Truncate to int32 before the bitwise work, mirroring the implicit
            // ToInt32 conversion that JavaScript's bitwise operators perform,
            // then widen back to int64 for the addition.
            temp := int32(v[1])
            v[0] += int64((((temp << 4) ^ (temp >> 5)) + temp) ^ int32(sum+key[int32(sum)&3]))
            sum -= delta // the JavaScript code does r += -1640531527
            temp = int32(v[0])
            v[1] += int64((((temp << 4) ^ (temp >> 5)) + temp) ^ int32(sum+key[(int32(sum)>>11)&3]))
        }
        fmt.Println(*v)
        // Output: [6092213800 11162584543]
    }
    

    Explanation

    The safe range of a JavaScript integer is between -(2^53 - 1) and 2^53 - 1 (see Integer range for Number). The tricky part in the JavaScript implementation is that bitwise operators always convert their operands to 32-bit integers (see Fixed-width number conversion).
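
    For example (a minimal sketch, not part of the original answer), that 32-bit conversion can be reproduced in Go by truncating an int64 to int32; the values below are picked only for illustration:

    package main

    import "fmt"

    func main() {
        // In JavaScript, `x | 0` applies ToInt32: keep the low 32 bits of the
        // integer and reinterpret them as a signed 32-bit value. In Go the
        // same effect is obtained by converting an int64 to int32.
        for _, x := range []int64{6092213800, 3000000000, -1640531527} {
            fmt.Printf("%d -> %d\n", x, int32(x))
        }
        // 6092213800 -> 1797246504    (same as 6092213800 | 0 in JavaScript)
        // 3000000000 -> -1294967296   (same as 3000000000 | 0 in JavaScript)
        // -1640531527 -> -1640531527  (already fits in 32 bits)
    }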

    To align with the JavaScript implementation, the data types should be int64 (int32 or uint32 does not have enough range for numbers between -(2^53 - 1) and 2^53 - 1; see the small check after this list). So these variables should be declared as int64:

    • items in v
    • items in key
    • sum
    • delta
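
    To get a feel for why 32-bit types are too small here (a small check, not part of the original answer), note that after 32 rounds sum has been decremented by 1640531527 thirty-two times, and the resulting values no longer fit in 32 bits even though they are still exact JavaScript numbers (and exact int64 values):

    package main

    import (
        "fmt"
        "math"
    )

    func main() {
        // sum after all 32 rounds of r += -1640531527
        finalSum := int64(-32) * 1640531527
        fmt.Println(finalSum)                           // -52497008864
        fmt.Println(finalSum < math.MinInt32)           // true: below the int32 range
        fmt.Println(int64(6092213800) > math.MaxUint32) // true: v[0] outgrows uint32 as well
    }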

    Then before we perform bitwise operations, we convert every operand to int32.
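
    As a concrete illustration of that last step (a sketch of the first half-round only, using the same inputs as above; not part of the original answer), writing the conversions out explicitly shows where JavaScript silently drops to 32 bits:

    package main

    import "fmt"

    func main() {
        // First half-round with v = [15, 16], key = [14, 15, 16, 17], sum = 0.
        var v0, v1, sum int64 = 15, 16, 0
        key := [4]int64{14, 15, 16, 17}

        t := int32(v1)                                  // ToInt32(e[1])
        mix := ((t << 4) ^ (t >> 5)) + t                // pure 32-bit arithmetic
        v0 += int64(mix ^ int32(sum+key[int32(sum)&3])) // XOR forces 32 bits, then widen back
        fmt.Println(v0)                                 // 301, matching the JavaScript first half-round
    }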