I've run into an interesting conversation with a co-worker. In some libraries' documentation you can still find instructions to import them like this:
const a = require('b/a');
However, I usually use destructuring like this:
const { a } = require('b');
The question is: which of the two methods is less efficient? According to my logic, Node will have to analyze the file in both cases to get access to a specific function anyway, so there shouldn't be any difference efficiency-wise. Am I correct?
The way require works is simple: it goes to the specified file, executes everything in it, and returns that file's module.exports to the module that called require. The result is cached the first time, so every subsequent call for the same module gets the cached response.
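As a minimal sketch of that caching (the file names logger.js and main.js are just placeholders), the side effects in the required file run only on the first require; later calls return the same cached exports object:

//logger.js
console.log('executing logger.js');            // runs only on the first require
module.exports = { log: (msg) => console.log(msg) };

//main.js
const logger1 = require('./logger');           // prints "executing logger.js"
const logger2 = require('./logger');           // served from the cache, prints nothing
console.log(logger1 === logger2);              // true: both are the same cached module.exports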
So, in your case, let's assume that you are requiring b.js:
//b.js
const a = require('./a'); // requiring b executes a.js...
const c = require('./c'); // ...and c.js as well
module.exports = { a, c };
b.js, a.js and c.js will be executed.
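To tie this back to the destructuring form from the question, here is a hypothetical consumer (assuming the files above belong to an installed package called b). Destructuring only picks a property off the exports object that b.js has already built; it does not stop the other files from being executed:

//consumer.js
const { a } = require('b'); // still runs b.js, which in turn runs a.js and c.js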
Or you could directly require only a.js:
//a.js
//no requires
const a = someFunction(); // someFunction stands in for whatever this module builds
module.exports = a;
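For comparison, requiring only that file directly (using the deep-path form from the question, and again assuming the package is called b) skips b.js and c.js entirely:

//consumer.js
const a = require('b/a'); // only a.js is loaded and executed; b.js and c.js are never touched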
In that simple case it is obvious that a call to the "a" module would be faster. But that assumes you are only using the a module of the dependency you are installing, and that is almost never the case. Either way, if you require the library in its entirety, it will be cached throughout your application and subsequent calls will be faster. So it also depends on the way your application is designed.
If you are going to use any of the library's other modules afterwards, caching everything in one go (at app initialization) should be the way to go, which means requiring the whole dependency.
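A rough sketch of that idea (app.js and handler are hypothetical names): pay the full load cost once at startup, and let any later require hit the cache.

//app.js
const b = require('b');        // the whole dependency is loaded and cached once, at startup

function handler() {
  const { a } = require('b');  // resolved from the require cache, so this is cheap
  return a;
}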
Also, I am not considering circular dependencies (the case where a.js in turn requires b.js), because in most cases that is considered poorly designed code (even though Node.js supports it). And if that is happening without you knowing it, calling a.js or b.js will not matter, because both will be executed (despite your efforts at optimization) and cached anyway.