I'm planning to use the Laravel framework in my next project. Right now I'm stuck on organizing links to static content (images, scripts, etc.). For example, I have this caching rule in nginx, which caches all static files:
location ~* \.(jpg|jpeg|gif|css|png|js|ico|html)$ {
    expires max;
}
How should I make a link to a file like main.js? And how do I tell proxies that a file is out of date once it has been modified?
In my current projects I have a versioning system which uses the file modification time to generate a unique prefix for each file, so a link looks like http://site.com/22566223435/js/main.js. The next time the file changes, the link changes automatically.
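Simplified, my current approach looks something like this (the helper name and the document-root lookup are just illustrative, not real project code):

<?php
// Illustrative helper: the numeric prefix changes whenever the file changes,
// so caches treat the modified file as a completely new URL.
function versioned_url($relativePath)
{
    $fullPath = $_SERVER['DOCUMENT_ROOT'].'/'.ltrim($relativePath, '/');
    $version  = file_exists($fullPath) ? filemtime($fullPath) : 0;

    return 'http://site.com/'.$version.'/'.ltrim($relativePath, '/');
}

echo versioned_url('js/main.js'); // e.g. http://site.com/1361234567/js/main.js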
What is the best practice in Laravel to achieve this functionality? Thanks!
I would do that with a query string: it's easy, accurate enough, and doesn't need any complex layers. Simply append the file's last modification time as a query string to its link. That will force the browser to re-fetch the file whenever it changes.
<link rel="stylesheet" type="text/css" href="/css/style.min.css?v=<?= File::modified(path('public').'/css/style.min.css') ?>" />
To simplify this, you could create a small class that generates links to files and caches each file's last modification time for better performance. It's also achievable with custom functions in the LESS compiler, if you use one.
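A rough sketch of such a class (the class name and structure are made up for illustration; it just wraps the File::modified() call above and memoizes the result for the current request):

<?php
// Illustrative only: caches each file's mtime in memory, so the filesystem
// is hit at most once per asset per page render.
class Asset
{
    protected static $versions = array();

    public static function url($relative)
    {
        if ( ! isset(self::$versions[$relative]))
        {
            self::$versions[$relative] = File::modified(path('public').$relative);
        }

        return $relative.'?v='.self::$versions[$relative];
    }
}

In a view that becomes something like href="<?= Asset::url('/css/style.min.css') ?>", and the mtime lookup happens only once even if the same asset is linked several times on the page.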
Now, about server-side caching: if your application has enough users to really benefit from it, you should be looking into CDNs, which also handle worldwide distribution and work wonderfully with that query-string scheme.
Edit:
It's also possible to approach this with RewriteRules on Apache (I don't have enough experience with nginx to help there, though). The same technique used to generate the query strings could be used to generate a prefix (or suffix) for the URI, which the rewrite rule then strips before serving the real file.
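For instance, something along these lines in the public .htaccess (the pattern and folder names are just examples) would let /1361234567/css/style.min.css be served from the real /css/style.min.css, so the prefix exists only to bust caches:

# Strip a purely numeric version prefix before looking the file up on disk.
RewriteEngine On
RewriteRule ^[0-9]+/(css|js|img)/(.+)$ /$1/$2 [L]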
Another thing you could try is defining a subdomain dedicated to static assets, like assets.example.com. That domain can be handled entirely by the webserver, without the Laravel stack. But it depends a lot on how your assets are developed, compiled and used across your project.
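With Apache, for example, that can be as small as a VirtualHost pointing at a plain directory (a sketch only, assuming mod_expires is enabled; the domain and paths are placeholders):

# No PHP, no Laravel: just files straight off the disk with a far-future expiry.
<VirtualHost *:80>
    ServerName assets.example.com
    DocumentRoot /var/www/assets

    ExpiresActive On
    ExpiresDefault "access plus 1 year"
</VirtualHost>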
Our Approach:
At our company we use CloudFront and S3 for assets that belong to database entities. Each entity has its own S3 directory, and each asset is versioned through a unique filename (an md5 hash, which avoids duplication when assets are re-uploaded). Something like:
/posts/876/060b90d67ac0c5e24da6de6ae547e3b1.jpg
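The key itself is cheap to build. A sketch of the idea (not our actual upload code; I'm assuming the hash is taken over the file contents, which is what makes re-uploads deduplicate):

<?php
// Illustration only: hashing the file contents means re-uploading an identical
// image produces the same key, so nothing gets duplicated on S3.
function asset_key($entity, $id, $localFile)
{
    $hash = md5_file($localFile);                      // e.g. 060b90d67ac0c5e24da6de6ae547e3b1
    $ext  = pathinfo($localFile, PATHINFO_EXTENSION);  // keep the original extension

    return '/'.$entity.'/'.$id.'/'.$hash.'.'.$ext;     // /posts/876/060b90d6...e3b1.jpg
}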
We also defined 10 subdomains on our CloudFront distribution, so browsers don't hit the limit of 6-8 concurrent requests per domain:
cdn0.example.com
cdn1.example.com
cdn2.example.com
... and so on
Each entry in our database uses a fixed subdomain picked by calculating resource.id % 10. This is extremely fast and always returns the same subdomain for a given entity (which helps both browser and CloudFront caches). It's about the best you can get for serving images.
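The pick itself is a one-liner (a sketch; the host names follow the cdnN pattern above):

<?php
// The same resource always maps to the same cdnN host, which keeps browser
// and CloudFront caches warm for that URL.
function cdn_host($resourceId)
{
    return 'cdn'.($resourceId % 10).'.example.com';
}

echo cdn_host(876); // cdn6.example.com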
UI images are stored on a dedicated subdomain, assets.example.com, and those are not versioned so far, because we don't change the design that often; if we do, we would probably put the new assets inside something like a /v2/ or /newthemename/ folder. This approach helps a lot with rollbacks and even user-chosen themes.
CSS and JS are served by Apache from within Laravel's /public directory. This is not the fastest way, but since we are focused on development right now, having automatic compilation of LESS and Closure output is far more important. When we launch for end users, we will probably build an automated deploy system which compiles the assets, publishes them on S3/CloudFront with a timestamp prefix, and caches each asset's latest timestamp for view rendering.