Tags: javascript, bit-manipulation, chromium, v8

JavaScript V8 Torque Engine Loading Binary Data Improperly


We load font files using opentype.js and have found a bug, either in our code, the V8 engine, or Chromium, that causes DataView.getInt16() to return a result 65536 higher or lower than it should be. This occurs very rarely (~0.25% of reads), but that is still hundreds of times a day for our users. Because it is so rare, we can only reproduce it on a couple of our computers, and not consistently: some browser tabs will always return the correct value and others will always return the incorrect one.
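
For reference, the kind of read that misbehaves boils down to the following (a minimal sketch with a made-up two-byte buffer, not our actual opentype.js call site):

    // Minimal sketch of the failing kind of read (buffer contents made up).
    const buffer = new ArrayBuffer(2);
    const view = new DataView(buffer);
    view.setInt16(0, 513);            // big-endian by default: bytes 0x02 0x01
    const value = view.getInt16(0);   // should always be 513
    console.log(value);               // we rarely observe 66049 or -65023 instead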

I am not a binary operations expert, but I know the basics.

Here's an example: Say we expect 513.

In binary we would expect: 00000000000000000000001000000001 (513)

If the result is 65536 too high, we can explain this with the 17th bit being flipped: 00000000000000010000001000000001 (66049 - 65536 = 513)

If the result is 65536 too low, we can explain this with the full set of preceding 16 bits being flipped: 11111111111111110000001000000001 (-65023 + 65536 = 513)
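
The arithmetic can be sanity-checked directly in a console (values copied from the examples above; the | 0 forces a signed 32-bit interpretation):

    const expected = 0b00000000000000000000001000000001;            // 513
    const bit17Flipped = 0b00000000000000010000001000000001;        // 66049
    const signBitsFlipped = 0b11111111111111110000001000000001 | 0; // -65023 as int32
    console.log(bit17Flipped - 65536 === expected);    // true
    console.log(signBitsFlipped + 65536 === expected); // true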

It seems that sometimes, somehow, either the 17th bit gets flipped to 1, or the entire run of leading bits produced by sign-extending the 16-bit value to 32 bits gets flipped to 1, which yields a negative two's-complement number.
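
As a temporary cross-check while debugging, one could reconstruct the value byte by byte and compare it against getInt16 (a workaround sketch; checkedGetInt16 is a hypothetical helper, and it assumes big-endian reads and that single-byte getUint8 reads are unaffected):

    // Hypothetical helper: cross-check getInt16 against a manual
    // byte-by-byte reconstruction with explicit sign extension.
    function checkedGetInt16(view, offset) {
      const fast = view.getInt16(offset); // big-endian read
      const raw = (view.getUint8(offset) << 8) | view.getUint8(offset + 1);
      const slow = raw >= 0x8000 ? raw - 0x10000 : raw; // manual sign extension
      if (fast !== slow) {
        console.warn(`getInt16 mismatch at offset ${offset}: ${fast} vs ${slow}`);
      }
      return slow;
    }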

We've been debugging for a few days and are looking for help with how to tackle this problem. We'd like to confirm whether the problem is in our code or something recently introduced into Chromium or V8.


Solution

  • A guess: this could be crbug.com/1466088. The fix is already making its way through the release channels.

    If that guess is right, then:

    • This bug occurs only on arm64 hardware, e.g. Macs with M1/M2 chips, and most Android devices. It never happens on Intel/AMD CPUs.
    • This bug occurs only when the new "Maglev" optimizing compiler is enabled. Starting a fresh Chrome instance with --js-flags="--maglev" makes it much more likely to happen; starting Chrome with --js-flags="--no-maglev" prevents it from ever happening.
    • This bug occurs only when the most-significant bits of the two bytes of the value you're loading differ. That means it wouldn't happen for the 513 in your example; it would happen for values like 0b1xxxxxxx0xxxxxxx or 0b0xxxxxxx1xxxxxxx (where x means "0 or 1, doesn't matter").

    Can you confirm any of these observations?
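
    To help confirm the third observation, a brute-force harness along these lines could be run in a tab started with --js-flags="--maglev" (a sketch; whether the loop gets hot enough to tier up into Maglev is an assumption):

        // Check getInt16 against manual sign extension for every 16-bit pattern.
        const view = new DataView(new ArrayBuffer(2));
        for (let round = 0; round < 1000; round++) {
          for (let u = 0; u <= 0xffff; u++) {
            view.setUint16(0, u);
            const expected = u >= 0x8000 ? u - 0x10000 : u; // manual sign extension
            const got = view.getInt16(0);
            if (got !== expected) {
              // Report whether the two bytes' most-significant bits differ.
              const msbs = `${(u >> 15) & 1}/${(u >> 7) & 1}`;
              console.warn(`mismatch at 0x${u.toString(16)} (byte MSBs ${msbs}): got ${got}, expected ${expected}`);
            }
          }
        }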