
Int32 computation difference between Java and JavaScript


I need to rewrite some legacy Java code that performs arithmetic transformations into TypeScript/JavaScript. The problem is that the legacy code uses the Java int type (signed 32-bit) and relies on overflow. I almost got what I want using Int32Array in JavaScript, but there is still a difference I can't explain. See below.

Java:

int current = -1599751945;
int next = current * 0x08088405 + 1;
System.out.println("next = " + next);

Output: next = 374601940

JavaScript:

const a = new Int32Array(4)

a[0] = -1599751945
a[1] = 0x08088405
a[2] = 1
a[3] = a[0]*a[1] + a[2]

console.log('a[3] = ' + a[3])

Output: a[3] = 374601952

Can someone explain the difference? And how can I get the same result in JavaScript? I tried shift operations, coercion with |0, conversion methods, etc., but the best result is the one above.


Solution

  • Use Math.imul() in JavaScript. The plain a[0] * a[1] is evaluated as a 64-bit floating-point multiplication; the product here is roughly 2.16 × 10^17, which exceeds 2^53, so the low bits are already lost before the result is truncated into the Int32Array. Math.imul() performs a true 32-bit integer multiplication and returns the exact low 32 bits, matching Java's int overflow behavior.

    const a = new Int32Array(4)
    
    a[0] = -1599751945
    a[1] = 0x08088405
    a[2] = 1
    a[3] = Math.imul(a[0], a[1]) + a[2]
    
    console.log('a[3] = ' + a[3])
    

    Additional details as to why can be found here.
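
    Since the goal is a TypeScript port, it can also help to wrap the Java-style int arithmetic in small helpers so the rest of the translated code stays readable. The sketch below is just an illustration; the helper names toInt32, mulInt32 and nextState are made up for this example and are not part of the original code.

    // Truncate a number to a signed 32-bit integer, like assigning to a Java int.
    const toInt32 = (x: number): number => x | 0;

    // 32-bit multiplication that overflows exactly like Java's int * int.
    const mulInt32 = (a: number, b: number): number => Math.imul(a, b);

    // The transformation from the question: next = current * 0x08088405 + 1, in Java int math.
    const nextState = (current: number): number =>
        toInt32(mulInt32(current, 0x08088405) + 1);

    console.log(nextState(-1599751945)); // 374601940, matching the Java output

    Note that |0 alone is enough for additions and subtractions, because the intermediate sums stay well below 2^53; it is only the multiplication of two 32-bit values that needs Math.imul().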