When you serve JavaScript to a page, it is best to serve a single packaged, minified, and gzipped file, to reduce both latency and the number of requests.
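For illustration, here's a minimal node.js sketch of that serving strategy (the bundle path and port are assumptions, and a real server should also check the client's Accept-Encoding header before compressing):

```
var http = require('http');
var fs = require('fs');
var zlib = require('zlib');

http.createServer(function (req, res) {
  if (req.url === '/bundle.min.js') {
    // One pre-built, minified bundle, gzipped on the way out.
    // A real server should first check req.headers['accept-encoding'].
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Content-Encoding': 'gzip'
    });
    fs.createReadStream('./build/bundle.min.js')
      .pipe(zlib.createGzip())
      .pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```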
But is it better to send:

1. one file containing all the JavaScript for the entire website, or
2. a page-specific file containing only the JavaScript that page needs?

With 1. you load more on the initial page load, but all of your JavaScript is then cached for the rest of the visit to your website.

With 2. you load only as much as is necessary for each page, so the initial load time is reduced, but you don't have the same file cached across every page of your website.
Which method is preferred?
I'm using node.js to serve my JavaScript and ender to package it.
Edit
Phrased differently: I'm thinking of automating my packaging. The automation will either package everything into one file for my entire website, or package a page-specific list of files into one file for each page.
I don't have any statistics on my JavaScript files yet, but I was curious which of the two automations I should implement.
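For concreteness, the first automation might look something like this hedged sketch (the src/ and build/ directories are assumptions, and minifying and gzipping would follow the concatenation):

```
var fs = require('fs');

// Glob every .js file under src/ into a single site-wide bundle.
var bundle = fs.readdirSync('src')
  .filter(function (name) { return /\.js$/.test(name); })
  .map(function (name) { return fs.readFileSync('src/' + name, 'utf8'); })
  .join(';\n'); // semicolons guard against unterminated statements

fs.writeFileSync('build/site.js', bundle);
```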
Option 1 is tenable.
Option 2 is a bad idea (for the reasons you specify).
You're missing a third option, which is to have one large core package, with secondary JavaScript includes loaded as necessary... no reason to load your Google Maps code on every page when you only need it here and there, for instance. But there's also no reason to re-serve your 'core' package.
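A hedged client-side sketch of that pattern: the core bundle ships on every page, and a secondary include is pulled in only where it's needed (the loadScript helper, the element id, and the Maps URL with its YOUR_KEY placeholder are all made up for illustration):

```
// Hypothetical helper: inject a script tag and run a callback when it loads.
function loadScript(src, onLoad) {
  var script = document.createElement('script');
  script.src = src;
  script.onload = onLoad;
  document.head.appendChild(script);
}

// Only pages that actually contain a map pay for the Maps download.
if (document.getElementById('map')) {
  loadScript('https://maps.googleapis.com/maps/api/js?key=YOUR_KEY', function () {
    // initMap() would be defined in this page's secondary package.
  });
}
```

This way the core stays cached across every page, while the heavy, rarely-used code is fetched once, on demand.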
This is generally the option I use. When speed is super-important, I use a subdomain with just about everything stripped out of Apache (no sessions or cookies, no PHP, etc.). I actually have one server which acts as a central static repository for all of my clients, plus an extra A record in the DNS for 'static', using virtual domains.
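Here's a rough node.js equivalent of that stripped-down static host, just to illustrate the idea: far-future caching, no cookies, nothing dynamic (the paths, port, and single Content-Type are assumptions):

```
var http = require('http');
var fs = require('fs');
var path = require('path');

var ROOT = path.join(__dirname, 'static');

http.createServer(function (req, res) {
  // Crude guard against '..' path traversal; resolve inside ROOT only.
  var file = path.join(ROOT, path.normalize(req.url).replace(/\.\./g, ''));
  fs.readFile(file, function (err, data) {
    if (err) { res.writeHead(404); res.end(); return; }
    res.writeHead(200, {
      'Content-Type': 'application/javascript', // assuming only JS lives here
      'Cache-Control': 'public, max-age=31536000' // cache for a year
    });
    res.end(data);
  });
}).listen(8080);
```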
Added: In response to your edit, I think the most appropriate thing is to keep a list of the files that need to be globbed together in your automation. Instead of taking 'everything', just take all of the items in a 'to_package' array.
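A hedged sketch of that automation in node.js (the pages manifest, directory names, and file names are made up for illustration; minification would happen after the concatenation, and build/ is assumed to exist):

```
var fs = require('fs');

// Hypothetical manifest: each page lists exactly the files it needs.
var pages = {
  home:    ['core.js', 'carousel.js'],
  contact: ['core.js', 'maps.js', 'form-validation.js']
};

Object.keys(pages).forEach(function (page) {
  var bundle = pages[page].map(function (file) {
    return fs.readFileSync('src/' + file, 'utf8');
  }).join(';\n'); // semicolons guard against unterminated statements

  fs.writeFileSync('build/' + page + '.js', bundle);
});
```

That keeps the per-page bundles explicit and repeatable: adding a file to a page is a one-line change to its 'to_package' list.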