If webpack has created separate bundles and we require.ensure
something in order to dynamically transfer and evaluate it at a later point in time, this happens via JSONP and some webpack JavaScript magic. If we have
require.ensure([ ], ( require ) => {
    console.log('before...');
    var data = require( './myModule.js' );
    console.log('after...');
}, 'myModule');
"after..."
will only get logged once that module has been entirely transferred and evaluated. If this chunk / module happens to be pretty big, containing images, CSS and whatnot, the loading will pretty much lock down the browser while the webpack JavaScript code unpacks the bundle with all its components.
Question: Is there any way to "hook" into that require
magic? For instance, it would be a dream scenario to have callbacks for the individual loading and evaluation steps, and so forth, assuming that our transferred bundle contains a lot of data. In general it just bothers me pretty hard that we have a nice option to asynchronously transfer whole bundles dynamically, but still have to evaluate that very bundle in a fully synchronous / blocking fashion.
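To make the wish concrete, here is a minimal sketch of the kind of hook API the question asks for. Nothing like this exists in webpack; the names (ensureWithHooks, loadChunk, onTransferred, onEvaluated) are invented for illustration, and the "chunk" is simulated with a Promise resolving to a map of unevaluated module factories.

```javascript
// Hypothetical: a loader wrapper that reports transfer completion and
// per-module evaluation progress. Purely illustrative, not a webpack API.
function ensureWithHooks(loadChunk, hooks) {
  return loadChunk().then((chunk) => {
    hooks.onTransferred();               // the chunk bytes have arrived
    const exportsByName = {};
    for (const [name, evaluate] of Object.entries(chunk)) {
      exportsByName[name] = evaluate();  // evaluate one module at a time...
      hooks.onEvaluated(name);           // ...and report progress after each
    }
    return exportsByName;
  });
}

// Simulated "chunk": each entry is an unevaluated module factory.
const fakeChunk = () =>
  Promise.resolve({
    'style.css': () => ({ applied: true }),
    'template.html': () => '<div>foo</div>',
  });

ensureWithHooks(fakeChunk, {
  onTransferred: () => console.log('transferred'),
  onEvaluated: (name) => console.log('evaluated', name),
}).then((mods) => console.log(Object.keys(mods).length, 'modules ready'));
```

With such hooks, a progress bar or an incremental unpacking strategy would be straightforward; without them, the whole chunk is opaque until evaluation finishes.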
I guess I was confused about the topic myself, so my question was probably not precise enough to get properly answered. However, my misunderstanding in the whole "CommonJS dynamic module loading" context was that require.ensure()
just transfers the module code (respectively the chunk which webpack created) over the wire. After that, the transferred chunk, which is basically just one big ECMAScript file, sits in the browser, cached but not yet evaluated. Evaluation of the entire chunk happens only on the actual require()
call.
Having said that, it is entirely in your hands how you decouple and evaluate the individual parts of a module / chunk. If, for example, like in my original question, a module require()s
in some CSS files, some images and some HTML, all of that gets asynchronously transferred on the require.ensure()
call. In which manner you require()
(and therefore evaluate) those parts is entirely up to you, and you can decouple those calls yourself if necessary.
For instance, a Module looks like this:
Module1.js
"use strict";

import { io } from 'socket.io-client';

document.getElementById( 'foo' ).addEventListener('click', ( event ) => {
    let partCSS = require( 'style/usable!./someCoolCSS.css' ),
        moarCSS = require( 'style/usable!./moarCoolCSS.css' ),
        tmpl    = require( './myTemplate.html' ),
        image1  = require( './foo.jpg' ),
        image2  = require( './bar.png' );
}, false);
Of course, all these files are already contained in the chunk which gets transferred to the client when some other module calls:
require.ensure([ 'Module1.js' ], ( require ) => {
}, 'Module1');
This is where my confusion was. So now we can just play with the require()
calls within module1.js
ourselves. If we really require a lot of files that way, we could even use a setTimeout
/ setImmediate
run-away timer to decouple the synchronous evaluation between each require()
call, if necessary or wanted.
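That decoupling idea can be sketched in a few lines. This is a minimal, generic helper, assuming parts is an array of thunks that each wrap one require() call (e.g. () => require( './foo.jpg' )); here plain thunks stand in for the require() calls so the sketch runs anywhere.

```javascript
// Evaluate an array of thunks one per timer tick, so the browser can
// repaint between evaluations instead of blocking on the whole batch.
function evaluateStaggered(parts, done) {
  const results = [];
  (function next(i) {
    if (i >= parts.length) return done(results);
    results.push(parts[i]());          // evaluate exactly one part...
    setTimeout(() => next(i + 1), 0);  // ...then yield to the event loop
  })(0);
}

// Usage with plain thunks standing in for require() calls:
evaluateStaggered(
  [() => 'css', () => 'template', () => 'image'],
  (results) => console.log(results.join(','))
);
```

In a browser, setTimeout(fn, 0) (or setImmediate where available) is enough to break the evaluation into separate tasks; each require() still runs synchronously, but the UI gets a chance to breathe in between.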
Actually a long answer for a pretty simple story.
TL;DR:
"require.ensure
transfers a whole chunk over the wire. This chunk contains all files which are part of a require()
call within the ensured module. But those files do not get evaluated automatically. That happens only when the actual require()
call is matched at runtime (which is represented by a webpackJsonp call at this point)"