Tags: linux, apache, compression, gzip, deflate

How to pre-compress very large html files


I need to pre-compress some very large HTML/XML/JSON files (large data dumps) using either gzip or deflate. I never want to serve the files uncompressed. They are so large and repetitive that compression will probably work very well. While some older browsers cannot handle compressed responses, my typical customers will not be using them (although it would be nice if I could generate some kind of "hey, you need to upgrade your browser" message).

I auto-generate the files, and I can easily generate .htaccess files to go along with each file type. Essentially what I want is an always-on version of mod_gunzip. Because the files are large, and because I will be serving them repeatedly, I need a method that lets me compress once, really well, on the command line.

I have found some information on this site and others about how to do this with gzip, but I wondered if someone could step me through how to do it with deflate. Bonus points for a complete answer that includes what my .htaccess file should look like, as well as the command-line code I should use (GNU/Linux) to obtain optimal compression. Super bonus points for an answer that also addresses how to send a "sorry, no file for you" message to non-compliant browsers.

It would be lovely if we could create a "precompression" tag to cover questions like this.

-FT


Solution

  • Edit: Found AddEncoding in mod_mime

    This works:

    <IfModule mod_mime.c>
     <Files "*.html.gz">
      ForceType text/html
     </Files>
     <Files "*.xml.gz">
      ForceType application/xml
     </Files>
     <Files "*.js.gz">
      ForceType application/javascript
     </Files>
     <Files "*.gz">
      AddEncoding gzip .gz
     </Files>
    </IfModule>
    

The docs make it sound like only the AddEncoding line should be needed, but I couldn't get that to work on its own.
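For the "compress once, really well, on the command line" part, plain gzip at its maximum setting is enough; zlib's deflate is the same algorithm, so there is little to gain from a separate deflate path. A minimal sketch (the file name `dump.html` is just an illustrative stand-in for a generated dump):

```shell
# Create a sample file standing in for a generated data dump
# (the real files would come from the generator).
yes 'repetitive line of data' | head -n 1000 > dump.html

# Compress at maximum compression (-9); -c writes to stdout,
# so the original file is kept intact.
gzip -9 -c dump.html > dump.html.gz

# Verify the archive's integrity before serving it.
gzip -t dump.html.gz
```

On highly repetitive data like this, the `.gz` file ends up a tiny fraction of the original size, which is exactly the payoff pre-compression is after.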

Also, Lighttpd's mod_compress can compress files on the fly and cache the compressed copies.
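For the "sorry, no file for you" bonus: one approach (a sketch, assuming mod_rewrite is enabled and only the `.gz` copies exist on disk) is to refuse requests from clients that don't advertise gzip support in their Accept-Encoding header:

```apache
<IfModule mod_rewrite.c>
 RewriteEngine On
 # If the client does not claim gzip support, refuse the request.
 RewriteCond %{HTTP:Accept-Encoding} !gzip
 RewriteRule \.gz$ - [F]
</IfModule>
```

The `[F]` flag returns 403 Forbidden; pairing it with an `ErrorDocument 403` directive would let you serve the "upgrade your browser" message instead of a bare error page.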