It’s important that ecommerce managers understand the different elements of webpage design that can impact page load speed and website performance.

When working with development teams, it’s essential that someone is there to ask the basic questions, so that obvious answers aren’t overlooked when problem solving takes place.

In my experience, responses to site performance problems can often become over-complicated and solutions appear that mask rather than resolve the underlying problems.

By understanding the basics, you ensure these are covered before any complex diagnostics are run and costs incurred to address problems.

I’ve known clients whose systems integrator (SI) has run up significant costs running advanced diagnostics on a website with intermittent performance issues, when the core problem could have been resolved by checking the basics.

I’ve also seen people sign off investment in new servers where misguided advice recommended throwing hardware at what was a configuration or software problem.

1. Optimise images

This is always my first check. I like to use page load tools like Pingdom or WebPageTest and then use the waterfall view to check file sizes. Anything over 20KB gets my attention, and I’ll ask designers if there is any way to reduce it without compromising quality.

The size of images has a significant impact on the download speed of the website. However, you have to agree on an optimum balance between the size and quality of the image – whilst some images can be >20KB, they are often the key brand/category/product merchandising tools, and replacing them with low quality images may impede sales in other ways.

A quick win is addressing issues with thumbnail images in carousels. I often find these aren’t optimised for the web, or images are being scaled down by the browser when the page is called. Getting the dimensions correct and providing an image at the right size really helps.
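
As a quick sketch of what this looks like in the markup (the filenames here are hypothetical), the fix is to serve an asset generated at the display size rather than letting the browser scale a large image down:

    <!-- Before: a large image scaled down by the browser -->
    <img src="/images/product-hero-1200.jpg" width="120" height="120" alt="Product">

    <!-- After: a dedicated thumbnail pre-sized on the server -->
    <img src="/images/product-thumb-120.jpg" width="120" height="120" alt="Product">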

You should also evaluate the potential for compressing images to reduce their file size. The Google Developers diagnostic tool is useful for running checks, but be careful when implementing the recommendations – always get them validated by a technical specialist.
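
If your team uses ImageMagick, a simple compression pass might look like the following – a sketch only, with an illustrative quality setting and filenames you’d need to adapt:

    # Strip metadata and re-encode at quality 85 (validate the setting per image)
    convert product-hero.jpg -strip -quality 85 product-hero-optimised.jpg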

One option to further reduce page size is to switch from baseline to progressive JPEG images, but there is a potential trade-off: progressive JPEGs load the full image straight away but with only some of the pixel data, meaning the image briefly looks pixelated before sharpening.

Ecommerce teams may well decide the slightly larger file size of a baseline image is preferable to a marginally diminished UX.
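
For reference, ImageMagick can also produce the progressive variant – again a sketch with hypothetical filenames:

    # Re-encode as a progressive (interlaced) JPEG
    convert product-hero.jpg -interlace Plane product-hero-progressive.jpg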

2. Browser caching

Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.

This needs to be applied across all pages with static resources and is an important way to reduce the number of HTTP requests – the more requests, the slower the page will load (unless you’re using multiple domains for loading assets, such as a CDN – see below).
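
As a sketch of how this is commonly configured – this assumes an Apache server with mod_expires enabled, and the lifetimes shown are illustrative rather than recommendations:

    <IfModule mod_expires.c>
        ExpiresActive On
        # Cache static assets locally so repeat views skip the network
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

Nginx, IIS and other servers have equivalent directives – ask your developers which applies.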

3. Enable Keep-Alive

Enabling HTTP Keep-Alive, or HTTP persistent connections, allows the same TCP connection to send and receive multiple HTTP requests, thus reducing the latency of subsequent requests.

The host domain should enable Keep-Alive for resources wherever possible. 
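
On Apache, for example, this is controlled in the main server configuration – the values below are illustrative defaults to discuss with your developers, not a prescription:

    # Allow multiple requests per TCP connection
    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout 5

Note that HTTP/1.1 uses persistent connections by default, so the check here is really that nothing in the server or proxy configuration is switching them off.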

4. Defer JavaScript

By minimising the amount of JavaScript needed to render the page, and deferring parsing of unneeded JavaScript until it needs to be executed, you can reduce the initial load time of your page.

There is often a lot of JavaScript parsed during the initial load. You should evaluate which JavaScript files can be deferred to reduce blocking of page rendering.

Be careful though: don’t defer JS files that will affect how the page performs if they’re needed upfront. Review the implications of deferment with the developers.
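
The markup change itself is small – a sketch with a hypothetical script name:

    <!-- Blocks parsing while it downloads and executes -->
    <script src="/js/carousel.js"></script>

    <!-- Downloads in parallel and executes only after the document is parsed -->
    <script src="/js/carousel.js" defer></script>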

5. Minify scripts

A check of the website can often reveal scripts that aren’t minified – I’ve seen this most with CSS files. There are tools available that can minify JavaScript and CSS files.
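
As one possible workflow – this assumes Node.js-based tools such as UglifyJS and clean-css are in use, which your team may well swap for others:

    # Minify JavaScript (compress and mangle variable names)
    uglifyjs main.js --compress --mangle -o main.min.js

    # Minify CSS
    cleancss -o styles.min.css styles.css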

6. Remove unnecessary script

Script builds up over time, even with good housekeeping. It’s sensible to undertake regular reviews to identify ‘dead’ code, such as redundant HTML, and remove it from the page.

Removing or deferring style rules that are not used by a document avoids downloading unnecessary bytes and allows the browser to start rendering sooner. Unused CSS can easily be identified by running simple diagnostics with tools like the Google Developers diagnostic tool.

Where there are disjointed inline scripts, these should be moved into a single external file wherever possible and minified. If an external file isn’t feasible, they should at least be consolidated under one script block.
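
A before-and-after sketch, with hypothetical function names standing in for real site code:

    <!-- Before: scattered inline blocks throughout the page -->
    <script>trackPageView();</script>
    <script>initCarousel();</script>

    <!-- After: one minified external file the browser can cache -->
    <script src="/js/site.min.js" defer></script>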

7. Minimise HTTP requests

The more requests a page makes, the longer it takes to load its content. One way to reduce requests is to convert image assets into sprites, cutting the total number of individual HTTP requests, e.g. arrows in scrollers and common backgrounds like the basket icon.

Assets that are being reused in multiple pages should all be consolidated into sprites and cached.
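
In CSS, a sprite works by shifting one shared image behind each element – the offsets and sizes below are illustrative:

    /* One image, one HTTP request, many icons */
    .icon { background-image: url('/images/sprite.png'); width: 16px; height: 16px; }
    .icon-arrow-left  { background-position: 0 0; }
    .icon-arrow-right { background-position: -16px 0; }
    .icon-basket      { background-position: -32px 0; }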

8. Optimise favicon

The favicon should be less than 1KB in size, and versioning should not be used – a small issue I’ve noticed on a few sites. Versioning results in the asset being downloaded again every time there is a deployment, and it’s unlikely that the favicon will change that regularly.
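
In the markup, the difference is simply dropping the cache-busting query string (the version number here is hypothetical):

    <!-- Avoid: the version parameter forces a re-download after every deployment -->
    <link rel="icon" href="/favicon.ico?v=2.1">

    <!-- Prefer: a stable URL the browser can cache long-term -->
    <link rel="icon" href="/favicon.ico">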

9. Compression

Compression reduces response times by reducing the size of the HTTP response. Gzip is commonly used to compress resources but it is worth asking your development team to confirm which compression technique is being used on the servers.

Using the Google Developers diagnostic tool, you can see resources flagged as compressible to reduce their transfer size.
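
On Apache, for instance, Gzip compression is typically enabled via mod_deflate – a minimal sketch covering the main text-based types:

    <IfModule mod_deflate.c>
        # Compress text-based responses before sending them over the wire
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>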

10. Remove error code and duplicate scripts

If there are any errors being generated on the server, these should be fixed. The most common errors are incorrectly formed HTML tags and JavaScript errors. You can use http://validator.w3.org to validate the website for errors.

You should validate all the main pages of the website using the above tool and ensure there are no errors, although this is time intensive. As an example, running one homepage through the validator surfaced three errors that needed to be addressed.

Validation should be a general practice, carried out on a regular basis rather than as a one-off exercise.

11. Remove 404 errors

(Updated following feedback from Frederic).

You need to minimise the number of 404 errors as these are dead ends. Google Analytics can be used to identify these pages, and it’s important to benchmark the problem using the data available in Webmaster Tools or other web-based software.

A 301 can be used when a page has a logical replacement, e.g. the catalogue structure has been updated and a product has moved to a new URL. The 301 is an important link between old and new whilst the new URL becomes established.
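
On an Apache server this can be as simple as a one-line rule – the paths here are hypothetical:

    # Permanently redirect the retired product URL to its replacement
    Redirect 301 /products/old-widget /products/new-widget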

However, just because a page returns a 404 doesn’t mean it should be given a 301 to another page – the most common error is to redirect a 404 URL to the homepage when the original page is no longer relevant, e.g. a product has been permanently removed from the catalogue.

In such cases a 301 is not good practice because you’re sending people to an irrelevant substitute page, and as these erroneous 301s build up they can adversely affect your SEO. Further reading: https://support.google.com/webmasters/answer/181708?hl=en&ref_topic=1724951

It’s good practice to ensure your 404 error page explains that the page is no longer available and suggests alternatives, providing links back to key pages on the website, including the homepage.

Some websites provide a site search tool on their 404 page to let the customer browse for relevant information.
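
Serving a custom 404 page is usually a one-line server setting – for example on Apache, assuming your error page lives at /404.html:

    ErrorDocument 404 /404.html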

12. Reduce DOM elements

A complex page means more bytes to download, and it also means slower DOM access in JavaScript. It makes a difference whether you loop through 500 or 5,000 DOM elements on the page when you want to add an event handler, for example. Remove any unnecessary encapsulating HTML elements.
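
You can get a rough element count for any page from the browser console – a quick diagnostic rather than a formal metric:

    // Counts every element in the document
    document.getElementsByTagName('*').length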

However, a webpage with more DOM elements will not necessarily load slower, provided the other optimisation techniques are being used effectively.

For example, for one client I benchmarked leading ecommerce brands and found that sites like John Lewis had up to 20% more DOM elements yet faster load speeds.

13. Content Delivery Network (CDN)

A CDN is most useful when the website’s user base is spread across geographically distant locations, e.g. if you experience a continued increase in traffic from non-UK locations. It’s also useful for video- and image-rich websites.

Using a CDN can benefit your page load speed as the resources are served from external servers, reducing the load on the origin server. However, there are often significant costs associated with it.

As an example, New Look appears to be using Amplience as their CDN provider for product images and videos.
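
In practice this just means static assets are referenced from the CDN’s hostname rather than your own – the domain below is hypothetical:

    <!-- Served from the CDN edge, not the origin server -->
    <img src="https://cdn.example.com/images/product-hero.jpg" alt="Product">
    <link rel="stylesheet" href="https://cdn.example.com/css/styles.min.css">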

14. Other general tips

The following are good coding standards to be followed in general, as advised by some developer friends of mine.

  1. Put stylesheets at the top.
  2. Avoid CSS expressions.
  3. Minimise the number of iframes.
  4. Choose <link> over @import in CSS files (see the example below).
  5. Don’t scale images in HTML.
  6. Use GET for AJAX requests.
  7. Use smart event handlers.
  8. Reduce cookie size wherever possible.
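
To illustrate point 4 above – a stylesheet referenced with <link> downloads in parallel, whereas @import serialises the downloads:

    <!-- Prefer: parallel download -->
    <link rel="stylesheet" href="/css/styles.css">

    /* Avoid inside CSS files: forces sequential loading */
    @import url('/css/extra.css');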

What do you look for when evaluating page load speeds?

I’d be interested to hear from other people and understand what you look at when trying to evaluate page load speed issues, especially from the tech obsessives amongst you who can add to my explanations and probably challenge me on some of the points!

If you don’t agree with something I’ve written, please let me know, I’m always open to suggestions and advice.