I was perusing some of the data provided by Google Webmaster Tools when I came upon the Site Performance section. This told me that doogal.co.uk took longer to load than 96% of sites on the web, which was a little embarrassing. I knew what was causing at least some of this slowdown: my incomplete list of UK postcodes, a page that has been getting gradually slower as the number of postcodes has grown. I’d already started to address that by paging the data, but the CSV data couldn’t really be paged without reducing its value. So I had a look at the suggestion provided by Google, which was to use Gzip compression on the page. I’d never really thought much about GZip compression before, assuming that if it was so useful it would be on by default, but I thought I’d give it a go anyway. So I fired up Fiddler and tried downloading my big CSV page. It took approximately 9.5 seconds to fully download, a figure I arrived at by averaging several reloads.
Next I added this line to my PHP source file, ob_start("ob_gzhandler");, and measured the difference. The download now took 3.7 seconds, wow! A lot of gain for very little effort.
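In case it helps anyone, here’s roughly what that change looks like in context. The only essential part is the ob_start() call, which has to run before any output is sent; ob_gzhandler checks the client’s Accept-Encoding header and only compresses when the browser supports it. The CSV-emitting code below is just a hypothetical sketch, not my actual page.

```php
<?php
// Enable gzip output buffering before any output is sent.
// ob_gzhandler negotiates with the client and falls back to
// uncompressed output if the browser doesn't accept gzip.
ob_start("ob_gzhandler");

// Hypothetical example of the rest of the page: emit CSV data.
header("Content-Type: text/csv");
echo "postcode,latitude,longitude\n";
// ... loop over postcode rows and echo them here ...
```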
At this point, I wondered if I could add Gzip compression to all my pages. It turns out this is straightforward: just add the following line to the .htaccess file:
php_flag zlib.output_compression on
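One caveat worth noting: php_flag in .htaccess only works when PHP runs as an Apache module (mod_php). If your host runs PHP as CGI or FastCGI, that directive has no effect, and the same setting would need to go in php.ini (or a per-directory user ini file) instead:

```
; php.ini - enable transparent gzip output compression for all pages
zlib.output_compression = On
```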
After adding this, the CSV page now arrived in 2.7 seconds. I’m not sure why this is even faster, but I’m not complaining. Now the rest of the site feels much snappier as well. So what am I missing? Is there a reason GZip compression isn’t on by default? Am I going to get bitten by this at some point in the future?