We load font files using opentype.js and have found a bug, either in our code, the V8 engine, or Chromium, that returns the result of DataView.getInt16() as 65536 lower or higher than it should be. This occurs very rarely (~0.25% of the time), but that's still hundreds of times a day for our users. Because it is so rare, we can only reproduce it on a couple of our computers, and not consistently: some browser tabs will always work and others will always give the incorrect value.
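For illustration, here is the kind of cross-check one could use to spot a bad read (a hypothetical helper, not our production code): it recomputes the value from the two raw bytes and compares.

```js
// Hypothetical cross-check: recompute the int16 from the raw bytes and
// compare it against what DataView.getInt16() returned.
function checkInt16(view, offset) {
  const fast = view.getInt16(offset);         // the value under suspicion
  const hi = view.getUint8(offset);           // high byte (big-endian default)
  const lo = view.getUint8(offset + 1);       // low byte
  const slow = ((hi << 8) | lo) << 16 >> 16;  // manual two's-complement sign extension
  if (fast !== slow) {
    console.warn(`getInt16 mismatch at offset ${offset}: got ${fast}, expected ${slow}`);
  }
  return slow;
}
```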
I am not a binary-operations expert, but I know the basics.
Here's an example: Say we expect 513. In binary we would expect:

00000000000000000000001000000001 (513)

If the result is +65536, we can explain this with the 17th bit being flipped:

00000000000000010000001000000001 (66049 - 65536 = 513)

If the result is -65536, we can explain this with the full set of preceding 16 bits being flipped:

11111111111111110000001000000001 (-65023 + 65536 = 513)
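The same arithmetic can be expressed directly in JavaScript, since its bitwise operators work on signed 32-bit integers, which is exactly the width involved here:

```js
const expected = 513;                        // 0b00000010_00000001
const plus64k  = expected | (1 << 16);       // 17th bit flipped -> 66049
const minus64k = expected | (0xFFFF << 16);  // top 16 bits flipped -> -65023
console.log(plus64k - 65536 === expected);   // true
console.log(minus64k + 65536 === expected);  // true
```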
It seems that sometimes, somehow, either the 17th bit gets flipped to 1, or the entire set of bits filled in at the front for the 16-bit to 32-bit conversion (the two's-complement sign extension) gets flipped to 1.
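To make the sign-extension step concrete, this is what a correct 16-bit to 32-bit conversion looks like in JavaScript; the buggy results look like this step going wrong in one of the two directions described above:

```js
// Correct 16-bit -> 32-bit sign extension: shift left, then use the
// arithmetic right shift to copy bit 15 into the upper 16 bits.
function signExtend16(u16) {
  return (u16 << 16) >> 16;
}
console.log(signExtend16(0x0201));  //  513 (bit 15 is 0, upper bits stay 0)
console.log(signExtend16(0xFE01));  // -511 (bit 15 is 1, upper bits become 1)
```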
We've been debugging for a few days and are looking for help with how to tackle this problem. We'd like to confirm whether it is caused by our code or by something recently introduced into Chromium or V8.
A guess: this could be crbug.com/1466088. The fix is already making its way through the release channels.
If that guess is right, then:
- starting Chrome with --js-flags="--maglev" makes it much more likely to happen,
- starting Chrome with --js-flags="--no-maglev" prevents it from ever happening,
- it only happens for results whose two bytes match 0b1xxxxxxx0xxxxxxx or 0b0xxxxxxx1xxxxxxx (where x means "0 or 1, doesn't matter"), i.e. exactly one of the two bytes has its highest bit set.

Can you confirm any of these observations?