Tags: html, css, asp.net, webserver

Maximizing parallel downloads for website


I've been reading articles about speeding up websites by serving static content from a cookieless domain. We have an ASP.NET website with links to images/css/js like

<script type="text/javascript" src="/js/something.js"></script>

I've been testing out the static content filter from this article and it seems to work great for situations like the above. However, we also have a lot of CSS files with styles like:

background-image: url(/images/something.jpg);

The static content filter won't work for these situations. Since a lot of our image locations are defined in CSS files, is there a good way to work around this?

Whenever we have the project loaded on our local development machines, we'll obviously want all the files served from localhost, so we can't hard-code all these locations.

Is there another solution out there or is there something we can change to make this work?


Solution

  • You'll need to change your CSS files. You might need to build a "deploy" script that modifies the files on the fly before you transfer them from your dev machine to the server, but you're not going to get around the fact that the complete path has to be hardcoded in the server's CSS.

    (Unless, of course, you load all your images with JavaScript and then modify your styles with it, an approach which has its own problems.)
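As a rough illustration of what such a deploy script could look like, here is a minimal sketch in Python that rewrites root-relative `url(...)` references in CSS files to point at a static host. The host name `static.example.com` and the directory layout are assumptions for the example, not anything from the question; on a dev machine you'd simply skip this step so paths stay relative to localhost.

```python
import re
from pathlib import Path

# Hypothetical cookieless static domain; protocol-relative so it
# works over both http and https.
STATIC_HOST = "//static.example.com"

# Matches url(/path), url('/path'), and url("/path") with an
# absolute (root-relative) path.
URL_PATTERN = re.compile(r"url\(\s*(['\"]?)(/[^)'\"]+)\1\s*\)")

def rewrite_css(text, host=STATIC_HOST):
    """Prefix every root-relative url(/...) with the static host."""
    return URL_PATTERN.sub(
        lambda m: "url({q}{h}{p}{q})".format(
            q=m.group(1), h=host, p=m.group(2)),
        text)

def deploy(css_dir):
    """Rewrite every .css file under css_dir in place before upload."""
    for path in Path(css_dir).rglob("*.css"):
        path.write_text(rewrite_css(path.read_text()))
```

For example, `rewrite_css("background-image: url(/images/something.jpg);")` yields `background-image: url(//static.example.com/images/something.jpg);`. Running this only as part of the deployment step keeps the source CSS untouched for local development.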