javascript · performance · prototype · v8 · javascript-engine

Will a JavaScript environment eventually recover after changing the [[Prototype]] of an object?


So, I've read the MDN disclaimers and warnings, I've read a great answer on the subject, but there's still something I want to know. This question actually came from an answer I gave to another question, here.

Let's say I decide to do the dirty deed. Something that I will regret for the rest of my life. Something that will stain me with shame forever and dishonor my family name. A purposeful, deliberate ending of --

Alright, enough of that. Anyway, here it is:

let proto = Object.getPrototypeOf(Function.prototype);

Object.setPrototypeOf(Function.prototype, {
  iBetterHaveAGoodReasonForDoingThis : "Bacon!"
});

//just to prove it actually worked
let f = (function(){});
console.log(f.iBetterHaveAGoodReasonForDoingThis);

// Quick, hide the evidence!!
Object.setPrototypeOf(Function.prototype, proto);

Basically, what I did there was change the [[Prototype]] of Function.prototype, an object that impacts pretty much every piece of JavaScript code you could write. Then I changed it back.

I wanted to illustrate a big change to the prototype chain, one that would impact a lot of code and send a lot of optimizations down the drain. I don't expect changing it back to fix anything (if anything, I expect it to make things even worse performance-wise). I'd love to know whether it does, but restoring performance wasn't the point of changing it back.

I just want to know: after a change like this, will the JavaScript environment begin to recover and start optimizing things again? Or will it just give up forever and run everything in deoptimized mode? Are there optimizations that will never be achieved because of this? Can I trust that, eventually, after a period of recovery, it will return to its regular state?

For context, I'm talking about engines like the most recent version of V8, not the primitive crap used by the likes of old Internet Explorer versions. I understand the answer could differ from engine to engine, but I hope there is some commonality among them.
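
A crude way to eyeball what I'm asking about would be to time many calls before the change, right after it, and again after letting the engine run for a while. (timeCalls is just a made-up helper for illustration, and the numbers will be noisy and engine-specific; it assumes an environment with performance.now(), i.e. a browser or recent Node.)

function timeCalls(label) {
  const f = function (x) { return x + 1; };
  const start = performance.now();
  let sum = 0;
  // f.call is found on Function.prototype, so these lookups are plausibly
  // in the blast radius of the change.
  for (let i = 0; i < 1e6; i++) sum += f.call(null, i);
  // Log sum too, so the loop isn't trivially dead code.
  console.log(label, (performance.now() - start).toFixed(1), "ms", sum);
}

timeCalls("before");
const saved = Object.getPrototypeOf(Function.prototype);
Object.setPrototypeOf(Function.prototype, { iBetterHaveAGoodReasonForDoingThis: "Bacon!" });
timeCalls("right after the change");
Object.setPrototypeOf(Function.prototype, saved);
for (let i = 0; i < 10; i++) timeCalls("recovery pass " + i);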


Solution

  • V8 developer here. This question does not have a simple answer.

    Most optimizations will "come back" (at the cost of spending additional CPU time, of course). For example, optimized code that had to be thrown away will eventually get recompiled.

    Some optimizations will remain disabled forever. For example, V8 skips certain checks when (and as long as) it knows that prototype chains have not been mucked with. If it sees an app modify prototype chains, it plays it safe from then on.

    To make things even more complicated, the details can and will change over time. (Which is why there's not much point in listing more specific circumstances here, sorry.)

    Background:

    There are many things JavaScript code might do that the engine must check for, even though most code never does them. (Take, for example, inheriting missing elements from an array's prototype: ['a', ,'c'][1] almost always returns undefined, except if someone did Array.prototype[1] = 'b' or Object.prototype[1] = 'b'; see the snippet after the two options below.) So when generating optimized code for a function, the engine has to decide between two options:

    (A) Always check for the thing in question (in the example: walk the array's prototype chain and check every prototype to see if it has an element at that index). Let's say executing this code will take 2 time units.

    (B) Optimistically assume that array prototypes have no elements, and skip the check (in the example: don't even look at prototypes, just return undefined). Let's say this brings execution time down to 1 time unit (twice as fast, yay!). However, in order to be correct, the engine must now keep a close eye on the prototype chains of all arrays, and if any elements show up anywhere, all code based on this assumption must be found and thrown away, at a cost of 1000 time units.
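
    To make that concrete, here is the array example from above as a runnable snippet. It only demonstrates the language semantics the engine has to guard against; nothing in it is V8-specific:

        // A "holey" array: index 1 has no own element.
        let arr = ['a', , 'c'];
        console.log(arr[1]); // undefined -- the common case worth optimizing for

        // Violate the assumption: give the prototype an element at index 1.
        Array.prototype[1] = 'b';
        console.log(arr[1]); // 'b' -- the hole is now filled from the prototype chain

        // Clean up, for the sake of everything else running in this environment.
        delete Array.prototype[1];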

    Given this tradeoff, it makes sense that the engine at first follows the fast-but-risky strategy (B), but when that fails even just once, it switches to the safer strategy (A), in order to avoid the risk of having to pay the 1000-time-unit penalty again.

    You can argue whether "even just once" is the best threshold, or whether a site should get 2, 3, or even more free passes before giving up on (B), but that doesn't change the fundamental tradeoff.
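
    If you want to poke at this yourself, below is a rough sketch of the kind of experiment you can run in V8's d8 shell. It assumes a d8 build started with --allow-natives-syntax plus --trace-opt and --trace-deopt; the %-prefixed helpers are internal, undocumented, and change between V8 versions, so treat this as an illustration rather than a stable API:

        // Run with: d8 --allow-natives-syntax --trace-opt --trace-deopt demo.js
        function hole(arr) { return arr[1]; }

        // Warm the function up and ask V8 to optimize it.
        %PrepareFunctionForOptimization(hole);
        hole(['a', , 'c']);
        hole(['a', , 'c']);
        %OptimizeFunctionOnNextCall(hole);
        hole(['a', , 'c']);

        // Violate the "array prototypes have no elements" assumption. If the
        // optimized code relied on it, --trace-deopt will show it being thrown away.
        Array.prototype[1] = 'b';
        hole(['a', , 'c']);

        // Keep the function hot; --trace-opt should eventually show it being
        // re-optimized, now with the extra checks included.
        for (let i = 0; i < 100000; i++) hole(['a', , 'c']);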