How to serve compressed HTML, JavaScript and CSS
Compressing web pages saves storage and bandwidth, and speeds up websites. Here is a way to compress and serve static HTML, CSS and JavaScript files.
Why gzip static pages?
Compressing static web pages (web pages which do not change each time they are loaded) with gzip (see The gzip home page) reduces the total amount of bandwidth used serving web pages, and reduces the time it takes a page to travel across the internet, because the file is smaller.

This is effective for HTML, JavaScript, CSS or text files, but image formats such as JPEG or PNG are already compressed, so there is no point in gzipping them, since the file size will not reduce by very much.
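You can see the difference for yourself by compressing one text file and one image and asking gzip to report the ratios. The file names below are only examples; substitute files from your own site.

# Compress one HTML file and one JPEG, keeping the originals (example names).
gzip --keep --best index.html photo.jpg

# Report compressed size, uncompressed size and ratio for each.
gzip --list index.html.gz photo.jpg.gz

Typically the HTML file shrinks to a fraction of its original size, while the JPEG barely changes.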
How to compress files
The gzip utility compresses files. It's standard on Unix, Linux, and BSD computers, and also available for Windows: Gzip for Windows.

On the command line, gzip --keep file.html creates a file called file.html.gz. (The --keep option stops gzip from deleting the original file.)
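For example, assuming a file called file.html in the current directory (the name is just a placeholder):

gzip --keep file.html

ls file.html*
# file.html  file.html.gz

Both the original and the compressed copy remain side by side, which is exactly what the Apache setup in the next section relies on.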
Make Apache use gzipped files
With Apache, make a site serve a gzipped version of a file by adding the following to .htaccess in the top directory of your public directory:

RewriteEngine on
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [L]

This tells Apache to serve the file ending in .gz, if it exists, as if it were the original file, provided the requester accepts the gzip encoding. So it assumes you have files like index.html and index.html.gz, or program.js as well as program.js.gz.

For more about .htaccess, see Apache Tutorial: .htaccess files.
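One caveat: depending on the server's MIME configuration, a file served under its .gz name may go out with the .gz type (for example application/gzip) and without a Content-Encoding: gzip header, in which case browsers will not decompress it. Treat the following as a sketch to adapt rather than something every site needs; it uses standard mod_mime and mod_headers directives and can go in the same .htaccess:

RemoveType .gz
AddEncoding gzip .gz

<IfModule mod_headers.c>
Header append Vary Accept-Encoding
</IfModule>

RemoveType stops .gz from overriding the real content type (so index.html.gz is still sent as text/html), AddEncoding marks the response as gzip-encoded, and the Vary header tells caches to keep the compressed and uncompressed versions separate.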
How to test whether it's working
The website HTTP Compression Test (whatsmyip.org) lets you test whether the gzipping is working. Type the URL of the static page which should be compressed into the box, and it gives you a green tick if compressed content is being sent.

For a more sophisticated check, Rex Swain's HTTP Viewer provides a facility to view the exact header and content. To get the content with "gzip", enter the word "gzip" in the Accept-Encoding box.
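You can also check from the command line with curl. The URL below is a placeholder; look for Content-Encoding: gzip in the response headers:

# Ask for gzip and show only the response headers.
curl --silent --head --header 'Accept-Encoding: gzip' https://www.example.com/index.html

# Or fetch the page and let curl decompress it; the output should be readable HTML.
curl --silent --compressed https://www.example.com/index.html | head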
To crush a lot of files
For someone with a lot of files, this script automates the process:
#!/usr/local/bin/perl
use warnings;
use strict;

# Set $verbose to a true value if you need to debug this script.
my $verbose;

use File::Find;

find (\&wanted, ("."));

sub wanted
{
    # Match .html, .css and .js files, regardless of upper or lower case.
    if (/(.*\.(?:html|css|js)$)/i) {
        my $gz = "$_.gz";
        if (-f $gz) {
            if (age ($_) <= age ($gz)) {
                if ($verbose) {
                    print "Don't need to compress $_\n";
                }
                return;
            }
        }
        if ($verbose) {
            print "Compressing $_\n";
        }
        # The following substitution is for the case that the file
        # name contains double quotation marks ".
        $_ =~ s/"/\\"/g;
        # Now compress the file.
        system ("gzip --keep --best --force \"$_\"");
    }
    else {
        if ($verbose) {
            print "Rejecting $_\n";
        }
    }
}

# This returns the last modification time of the file.
sub age
{
    my ($file) = @_;
    my @stat = stat $file;
    return $stat[9];
}

Run this from the top public directory. It will find and compress any file ending in .html, .css, or .js, regardless of upper or lower case.
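If you prefer not to use Perl, a rough equivalent is a single find command. Unlike the script above, this sketch recompresses every matching file each time it runs instead of skipping files whose .gz copy is already up to date:

find . \( -iname '*.html' -o -iname '*.css' -o -iname '*.js' \) \
    -exec gzip --keep --best --force {} +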
This article was originally written for the internal Wiki of web hosts NearlyFreeSpeech.net. I'm also publishing it here publicly for the benefit of non-members. If you are a member of NearlyFreeSpeech.net web hosts, you can view this page at https://members.nearlyfreespeech.net/wiki/HowTo/GzipStatic.
Web links
- Scripts for compressing a static website (github): a project which includes a way to store all the website materials in compressed form and then uncompress them if the requester does not accept gzip content.