James Royal-Lawson


5 basic (and practical) ways to make your site load faster

Faster is better. There’s no arguing about it. Google likes faster pages, people like faster pages. Your 3G wireless connection loves leaner, slimmer pages. God damn, your shareholders love them too!

Website performance optimisation, like search engine optimisation – or any optimisation – is a fine art. It also follows the law of diminishing returns: the largest efficiency gains can be made from a handful of optimisations. Further gains are possible, but at an increasingly higher cost.

5 basic tips

Here are 5 practical ways that you can improve the load time of your web pages.

  1. Head stuff. Put CSS first, JavaScript last – JavaScript blocks the loading of the other files in the head part of your HTML. Put your CSS and JavaScript in as few files as possible, and minify them all.
  2. Width and Height. Declare width and height on all img tags – this reduces the need for browser redrawing/reflow once images are loaded. Yes, you can set the size in CSS, but declaring it in the tag itself is marginally better and often something web editors can do themselves.
  3. Code efficiently. Both HTML and CSS. Deeply nested div tags, empty div tags and over-qualified CSS selectors make James cry. For many this will be the least practical suggestion in this list.
  4. Static content. Serve static content using a CDN or (for example) Amazon S3 and a separate domain. With Amazon S3 it is very easy for any website to spread requests over 3 domains if needed. An additional advantage of a separate domain is that you can keep it cookie-free, reducing requests and bandwidth even further. Exception: serve your CSS from your main domain.
  5. Trim the fat! Have less stuff on your page! – Groundbreaking advice, I know, but it’s too easy to let your web pages become cluttered, especially with sliding banners, widgets, and other externally hosted scripts. Use ROPS to assess the cost of new features.
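To make tip 2 a little more concrete, here’s a small Node.js sketch – a hypothetical helper, not an existing tool – that scans an HTML string for img tags missing explicit dimensions:

```javascript
// Sketch: find <img> tags that lack explicit width/height attributes,
// which force the browser to redraw/reflow once each image loads.
// A regex scan is a rough heuristic, not a full HTML parser.
function imagesMissingDimensions(html) {
  const imgTags = html.match(/<img\b[^>]*>/gi) || [];
  return imgTags.filter(
    (tag) => !/\bwidth\s*=/i.test(tag) || !/\bheight\s*=/i.test(tag)
  );
}

const page = `
  <img src="logo.png" width="120" height="40" alt="logo">
  <img src="banner.jpg" alt="banner">
`;
console.log(imagesMissingDimensions(page));
// Only the second tag is flagged: it declares no width or height.
```

You could run something like this over your templates to catch reflow-causing images before they ship.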

Bonus tips

It goes without saying that you should be using gzip compression and sensible caching policies on your web host – if you’re not already, then fixing it is probably not “simple” (or, unfortunately, someone somewhere isn’t up to their job).

Google’s Page Speed analyzer, available as an addition to the Firefox add-on Firebug, gives some excellent feedback on the state of your web site, including practical tips. It also provides optimised images and compressed scripts as ready-to-download files.

Yahoo have written up best practices with regard to improving site speed. The list includes 35 different tips, and should keep you going a fair while.

Finally, this article is a great bit of further reading; it gives detailed advice, written in a way that makes it accessible to more than just the front-end web developers it targets.


If you are working on a responsive website, then congratulations! Make sure website performance optimisation is a clear part of your work. It’s more important than ever in these types of websites and has a huge impact on the user experience.

If looking for ways to optimise your website feels a little beyond you, get in touch, I’d be more than happy to help you out.

James Royal-Lawson is a digital strategist, optimiser and web manager based in Stockholm, Sweden.

Google Analytics: Updated visit definition is missing visits

Google updated their definition of a visit in the middle of August. I’ve written an explanation in a separate blog post. In general the change is good, as it should make the data in Google Analytics easier for the layman to interpret.

What isn’t so good is that Google Analytics isn’t behaving in the way Google describes. It’s not only missing visits in some situations, but it is also missing some traffic sources – the attribution is totally incorrect for some visits.

Test details

My test was as follows:

Using my Android tablet, I visited my blog a series of 4 times. I used my tablet so that it would be easy to extract my test visits (with little chance of anyone else visiting the same pages from the same sources on that day).

Visit 1

Via a link on one of my old sites
The browser newly opened 
Not visited in the past 30 minutes.

Visit 2

Via Google's search results searching for beantin fishbang
A few minutes after visit 1.

Visit 3

via Google again, this time searching for beantin seo
The browser newly opened
Over 60 minutes since visit 2.

Visit 4

Via a link on another one of my old sites
Just a few minutes after visit 3.

According to Google’s new visit definition, this should have been 4 visits, with 4 different traffic sources.

What the data contained

Detail of a screenshot

According to my Google Analytics data, I had made 3 visits. Visit 4 is missing. Instead, the beantin seo search has had 2 page views attributed to it – which, as my test actions show, simply isn’t true.

Showing all 4 visits happened

As a way of confirming that visit 4 really did happen, and that data showing the referral from my old site was received by Google Analytics, I made use of my per-visit referrer script.

This script runs on each page view and saves the referrer for each visit to a custom variable at the visit level.
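The post describes the script only in outline. As an illustrative sketch (not the author’s actual code), using the classic async Google Analytics API it might look something like this – the slot number and variable name are arbitrary choices for the illustration:

```javascript
// Sketch of a per-visit referrer logger using the classic async
// Google Analytics API (_gaq). Scope 2 stores the custom variable
// at the visit (session) level, so a later page view in the same
// visit overwrites the value – which is what exposes the bug.
function recordVisitReferrer(gaq, referrer) {
  gaq.push(['_setCustomVar',
    1,                    // slot (1-5); slot 1 chosen for illustration
    'visit-referrer',     // variable name (illustrative)
    referrer || '(none)', // value: the referring URL
    2                     // scope 2 = visit/session level
  ]);
  gaq.push(['_trackPageview']); // custom var must be set before the pageview
}

// In a browser this would be called with window._gaq and document.referrer;
// here we use a plain array so the sketch is self-contained.
const _gaq = [];
recordVisitReferrer(_gaq, 'http://example-old-site/some-page');
console.log(_gaq);
```

The key detail is that `_setCustomVar` is pushed before `_trackPageview`, so the referrer rides along with every page view’s visit data.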

This means visit 4 will have over-written the referrer for visit 3 – as Google hasn’t triggered a new visit for visit 4, but there is a page view, so my script grabs the referrer…

Detail of a screenshot from Google Analytics showing the referring site

As you can see from the screenshot, the referring site is there – meaning a visit did come from that site, and there are two page views attributed to it (the page views from visits 3 and 4).

Bug or feature?

I’ve repeated this test on my laptop and examined the cookies after each visit, and Google Analytics is failing to update the traffic source (in __utmz) and subsequently failing to trigger a “new visit” according to their new definition.

A bug or a feature? I say bug… what do you think?

Update 20110915

When researching this blog post, I focused my attention on the __utmz cookie. I’ve just taken a closer look at how both __utma and __utmz are behaving in the above scenario.

Google Analytics is failing to update not only the traffic source, but also the visit count and the various timestamps stored in __utma detailing when your last and current visits took place.

This means that even more reports in Google Analytics could be affected (depending on your visitor patterns).
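For reference, __utma holds six dot-separated fields: a domain hash, the visitor ID, the first/previous/current visit timestamps (unix seconds) and the visit count. A small sketch that unpacks them – the sample cookie value below is made up for illustration:

```javascript
// Sketch: unpack the six dot-separated fields of a classic GA
// __utma cookie. The sample value is invented for illustration.
function parseUtma(value) {
  const [domainHash, visitorId, firstVisit, previousVisit,
         currentVisit, visitCount] = value.split('.');
  return {
    domainHash,
    visitorId,
    firstVisit: new Date(Number(firstVisit) * 1000),       // unix seconds
    previousVisit: new Date(Number(previousVisit) * 1000),
    currentVisit: new Date(Number(currentVisit) * 1000),
    visitCount: Number(visitCount),                        // session counter
  };
}

const sample = '173272373.1245678901.1316000000.1316080000.1316090000.4';
console.log(parseUtma(sample).visitCount); // → 4
```

Inspecting these fields after each test visit is exactly how you can verify whether the visit count and timestamps were updated.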

James Royal-Lawson is a freelance web manager and strategist based in Stockholm, Sweden.

Google Analytics: what are visits?

It’s all change with Visits and Google Analytics. In August 2011 Google altered when they consider a session to have ended. A small change according to their blog.

Get ready, I’m going to mention some odd sounding cookie names a fair bit!

close up photo of a Google Analytics visitor graph

In the old days

Prior to August 2011, if a user was inactive for 30 minutes or more, any future activity would be attributed to a new visit. Any user that left your site but returned within 30 minutes was counted as part of the original visit.

Google made use of two cookies to keep track of a session: one called __utmb and another called __utmc.

__utmc was a pure browser session cookie which expired as soon as the browser was closed. If __utmc didn’t exist, then it was a new visit.

__utmb is a persistent cookie set (by default) to expire after 30 minutes. This cookie was used to register a new visit if you’d left the site open in your browser (i.e. __utmc exists) but disappeared for more than 30 minutes to do something else – perhaps to eat lunch.
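The old rule boils down to simple timestamp arithmetic. Here’s a sketch (my reading of the behaviour described above, not Google’s actual code):

```javascript
// Sketch of the pre-August-2011 rule: a hit arriving 30 minutes or
// more after the previous one starts a new visit.
const SESSION_TIMEOUT_MS = 30 * 60 * 1000;

function countVisits(hitTimes) {
  let visits = 0;
  let lastHit = -Infinity; // no previous hit yet
  for (const t of hitTimes) {
    if (t - lastHit >= SESSION_TIMEOUT_MS) visits += 1; // 30+ min gap
    lastHit = t;
  }
  return visits;
}

// Hits at 0, 10 and 50 minutes: the 40-minute gap starts a second visit.
const min = 60 * 1000;
console.log(countVisits([0, 10 * min, 50 * min])); // → 2
```

Note that under this old rule the traffic source of each hit is irrelevant – only the gap between hits matters, which is exactly what changed in August 2011.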

Back to the future

From August 2011, the session cookie __utmc is no longer used for calculating visits; instead Google is using the __utmb cookie in combination with another cookie, __utmz, to determine when a new session begins.

__utmz is the traffic source cookie. I’ve explained the often misunderstood Google Analytics traffic sources in a previous post. This cookie only gets updated when the traffic source for the current visit is different to the traffic source stored in the cookie (excluding direct visits).
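For reference, __utmz packs a domain hash, a timestamp, the session number and a campaign number, followed by the campaign fields (utmcsr = source, utmcmd = medium, utmccn = campaign, utmctr = search term). A sketch that unpacks it – the sample value below is made up:

```javascript
// Sketch: unpack a classic GA __utmz traffic-source cookie.
// Format: domainHash.timestamp.sessionNumber.campaignNumber.<campaign data>
// The sample value is invented for illustration.
function parseUtmz(value) {
  const parts = value.split('.');
  const campaign = {};
  // Re-join from field 5 onwards: a referring domain in utmcsr
  // (e.g. "example.com") can itself contain dots.
  for (const pair of parts.slice(4).join('.').split('|')) {
    const [key, val] = pair.split('=');
    campaign[key] = val;
  }
  return {
    sessionNumber: Number(parts[2]), // incremented on each new visit
    source: campaign.utmcsr,         // e.g. google, or a referring domain
    medium: campaign.utmcmd,         // organic / referral / (none)
    campaignName: campaign.utmccn,
    term: campaign.utmctr,           // search term, when present
  };
}

const sample =
  '173272373.1316000000.4.2.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=beantin seo';
console.log(parseUtmz(sample).source); // → "google"
```

Comparing the source/medium fields of this cookie before and after a test visit is how you can see for yourself whether GA registered a traffic-source change.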

What Google Analytics does now is reset the __utmb cookie and increment the session counter in __utma (the 2-year persistent cookie storing your unique ID amongst other things) every time the __utmz cookie is updated – ie, each time the traffic source changes.

It does this whether the __utmc cookie exists or not. So, closing your browser, reopening it and revisiting a site (within 30 minutes of __utmb last being updated) will count as part of the same visit.
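Put together, the new-visit decision can be sketched as a simple predicate (again, my reading of the behaviour described above, not Google’s actual code):

```javascript
// Sketch of the post-August-2011 rule: a new visit starts when the
// __utmb cookie has expired, or when a non-direct traffic source
// differs from the source already stored in __utmz.
function isNewVisit(utmbExpired, incomingSource, storedSource) {
  if (utmbExpired) return true;                    // 30 minutes of inactivity
  if (incomingSource === '(direct)') return false; // direct never resets __utmz
  return incomingSource !== storedSource;          // source change => new visit
}

console.log(isNewVisit(false, 'google / organic', 'example.com / referral')); // → true
console.log(isNewVisit(false, '(direct)', 'google / organic'));               // → false
```

Note the predicate never consults __utmc – which is why closing and reopening the browser no longer starts a new visit on its own.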

So what does this mean?

This means that you can’t compare visit data that crosses the date divide of August 16th 2011. Year-on-year comparisons are out of the window.

You will also see an increase in visits. How much of an increase depends on your traffic patterns – if visitors frequently hopped back and forth to your website from other sites or search engines in a short space of time, you’ll see a much bigger jump than, say, a blog with a relatively low publishing frequency.

You will see a slight increase in traffic sources as the splitting of visits up into per-source chunks should reveal sources that were previously buried. Average page views per visit will fall slightly, and bounce rates will rise.


My research has shown that visitors re-entering a site (within 30 minutes) via a referring site (not a search engine, or a visit with campaign tracking) are not causing the __utmz cookie to be updated, and no new visit is recorded. These visits are being considered a continuation of the original visit.

If we ignore the oddity of referring sites not being recorded properly, this change is probably going to make session-based reports easier for the layman to interpret, and it’s a step closer to seeing per-visit traffic sources out of the box.

James Royal-Lawson is a freelance web manager and strategist based in Stockholm, Sweden.
