Site optimisation

Since I discovered the Google PageSpeed Insights metrics, I’ve been tweaking my blog output to maximise the score. Recent design changes mean I’m now scoring very well on these tests.

This has been made much easier by DmSite which can regenerate the entire site when I make a change. The use of DmSite also means I can choose which optimisations to apply to a particular page, as these optimisations do come with a resource cost. Currently, I’m using full optimisation on my home page and all blog posts, but in the future I can be more selective, perhaps only fully optimising the most recent pages.

I’m going to run through the site’s performance scores from a few benchmarks. In each case I tested with my recent “An unexpected GH3” blog post. As this features lots of images, it’s one of my heavier pages.

Google PageSpeed Insights

Google PageSpeed scores

Google PageSpeed Insights has been my main test for several months, and its reporting has driven a lot of the functional and design changes on the site.

On the performance side I was already inlining CSS and JavaScript files, optimising JPEG images, delaying JavaScript loads, using site-wide compression, and setting cache expiry headers. I’ve now enabled PNG image optimisation, but the biggest change is optimising externally hosted images.
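Of these, the cache expiry headers are the easiest to illustrate. This is a minimal sketch of the idea rather than DmSite’s actual mechanism, which the post doesn’t show:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: far-future caching headers for static assets.
# The function name and defaults are my own, not DmSite's.
def cache_headers(max_age_days=365):
    """Return caching headers telling browsers to keep an asset."""
    expires = datetime.now(timezone.utc) + timedelta(days=max_age_days)
    return {
        # HTTP dates use the fixed RFC 7231 format, always in GMT.
        "Expires": expires.strftime("%a, %d %b %Y %H:%M:%S GMT"),
        "Cache-Control": f"public, max-age={max_age_days * 86400}",
    }
```

The long expiry is what lets repeat visitors skip re-downloading images and stylesheets entirely.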

The issue arose where I linked to images hosted on my SmugMug site. Particularly for photography posts I often embed the SmugMug small or medium-sized version of an image and link through to the full-size (multiple-megabyte) original. Referencing both from SmugMug saves me bandwidth and host space[1], keeps the versions in sync, and saves me from manually copying files around. However, the SmugMug copies are not optimised for size.

I’ve added new code to DmSite so that for externally linked images it takes a local copy and modifies the image src accordingly[2]. This local copy is optimised with jpegtran as normal. This obviously puts more strain on my own site, but I can turn this option off and re-sync the site if I get Slashdotted, or similar.
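DmSite’s real implementation isn’t published, so this is an illustrative sketch only: the function names and the `images/ext/` path are my own, and I’ve used a URL hash for the local filename as one collision-avoiding scheme, where the post’s actual naming encodes the original URL[2].

```python
import hashlib
import re

def local_name(url):
    """Derive a collision-free local filename from the source URL (assumed scheme)."""
    digest = hashlib.sha1(url.encode("utf-8")).hexdigest()[:12]
    tail = url.rsplit("/", 1)[-1]
    ext = tail.rsplit(".", 1)[-1] if "." in tail else "jpg"
    return f"images/ext/{digest}.{ext}"

def localise_images(html, fetch, save):
    """Rewrite external <img> src attributes to point at local copies.

    `fetch(url)` returns the image bytes; `save(name, data)` writes the
    local copy, which is where jpegtran optimisation would then run.
    """
    def replace(match):
        url = match.group(1)
        if not url.startswith("http"):
            return match.group(0)          # already local, leave alone
        name = local_name(url)
        save(name, fetch(url))
        return match.group(0).replace(url, name)

    return re.sub(r'<img[^>]+src="([^"]+)"', replace, html)
```

Passing `fetch` and `save` as callbacks keeps the rewrite testable without touching the network, and makes it easy to switch the option off and re-sync, as described above.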

The usability measures have been very helpful as well, particularly for mobile. Most of the changes I’ve made were discussed previously in “A more responsive blog design”.

The remaining work was mostly around links, in particular avoiding clusters of small tap targets. At the top of the page, the full site trail and post navigation would wrap on a small screen, making it hard to hit the right link with a finger:

Screenshot of wrapped links in a browser

The solution was to use CSS media selectors to show a simplified navigation section on smaller displays:

Screenshot of mobile style links in a browser

The footer of the page has had a cleanup. A lot of the text links have been replaced by icons which present larger targets and can be spaced out without looking too weird.


GTmetrix scores

The GTmetrix test adds a few new criteria and applies tougher thresholds than Google’s.

The changes prompted by these tests were:

  • Setting UTF-8 as the content type in the HTTP headers
  • Inlining the icon images as base64-encoded data
  • Fixing a dead link referenced in the CSS
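The icon inlining works by converting small images to data: URIs, so they cost no extra HTTP request. A minimal sketch of the idea (the function name is my own; DmSite’s actual code isn’t shown):

```python
import base64
import mimetypes

def data_uri(path, data):
    """Turn a small icon file into a data: URI for direct embedding in HTML/CSS."""
    mime = mimetypes.guess_type(path)[0] or "application/octet-stream"
    encoded = base64.b64encode(data).decode("ascii")
    return f"data:{mime};base64,{encoded}"
```

Base64 inflates the payload by roughly a third and the result can’t be cached separately, which is why this trade-off only makes sense for small icons.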

On the thresholds, GTmetrix reported minification issues with the HTML (4%, or a whole 490 bytes, overweight), as well as with the Google Analytics and jQuery JavaScript. It was also unhappy with the short cache expiry time on the Google Analytics file. I might look at the HTML issue eventually; the other things are outside of my control.
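The HTML minification GTmetrix wants is mostly whitespace collapsing. A deliberately naive sketch of my own, not anything DmSite does: a real minifier must leave `<pre>`, `<textarea>` and inline scripts untouched, which this does not.

```python
import re

def minify_html(html):
    """Collapse redundant whitespace in HTML markup (naive: ignores <pre> etc.)."""
    html = re.sub(r">\s+<", "><", html)      # drop whitespace between tags
    html = re.sub(r"[ \t]{2,}", " ", html)   # collapse runs of spaces/tabs
    return html.strip()
```

On a page that is only 490 bytes over the threshold, this is the kind of change that saves more in tidiness than in load time.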

The other aspect I’m losing marks for is not using a Content Delivery Network. Using a CDN for this site would be somewhat excessive.

The biggest issue I had with GTmetrix is that it doesn’t seem to understand the use of CSS to set image dimensions, and instead insists on explicit width and height attributes, which is rather old-fashioned.


Pingdom

The Pingdom checker is mostly happy with the site, marking me down only for the short expiry time on the Google Analytics JavaScript. It didn’t highlight anything the other tests hadn’t already.


I’ve done almost all I intend to do here. There is a little HTML optimisation still outstanding, but it makes no odds in practice. I would like to make the CSS inlining smarter, so that it only inlines the rules actually used by the page.
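A first cut at that could match each rule’s leading selector against the page. This is a deliberately naive sketch of my own devising, using regexes where a real version needs proper CSS and HTML parsers, and ignoring combinators, pseudo-classes and media queries entirely:

```python
import re

def used_rules(css, html):
    """Keep only CSS rules whose leading class or tag selector matches the page."""
    class_names = set()
    for attr in re.findall(r'class="([^"]*)"', html):
        class_names.update(attr.split())

    kept = []
    for match in re.finditer(r"([^{}]+)(\{[^{}]*\})", css):
        head = match.group(1).strip()
        if not head:
            continue
        selector = head.split()[0]
        if selector.startswith("."):
            used = selector[1:] in class_names      # class selector
        else:
            used = re.search(rf"<{selector}\b", html) is not None  # tag selector
        if used:
            kept.append(f"{selector} {match.group(2)}")
    return "\n".join(kept)
```

Even this crude filter shows the shape of the win: pages stop carrying inlined rules for widgets they never render.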

I’m not going to fix the issues with externally referenced JavaScript. I could rig something up to take a local copy and optimise that, but it would be counter-productive. By loading these files from Google directly there’s a good chance the reader will have the files in their cache already, short expiry or not, whilst for my optimised copy it would definitely be a new download.

  1. My hosting is good quality, but relatively restrictive on disc space and bandwidth. I’ve had bad experiences with “unlimited everything” hosting, so would rather pay for a real level of service. This does mean that if there’s a chance to throw all the big downloads somewhere else, I’ll take it. [Back]
  2. To avoid potential name clashes the image is saved as a name that encodes the original URL, e.g. [Back]