It’s widely accepted that minimising page load time is good practice for ecommerce, ensuring visitors get a fast service and search engines don’t mark webpages down for poor usability.

But what does this mean? What are the techniques you need to use to achieve this? And if you’re not technical, how can you make sense of the jargon to make sensible decisions?

In this blog, I look at the most common causes of slow page load speeds, which will hopefully give ecommerce managers a useful starting place.

When I started out in ecommerce, I didn't get this at all, but I've spent time learning over the years to ensure that I know what questions to ask and what to look for.

You don't have to grasp every technical detail to understand the essentials, and that, for me, is something all ecommerce managers must master.

It’s important that ecommerce managers understand the different elements of webpage design that can impact page load speed and website performance.

When working with development teams, it’s essential that someone is there to ask the basic questions that ensure obvious answers aren’t overlooked when problem solving takes place.

In my experience, responses to site performance problems can often become over-complicated and solutions appear that mask rather than resolve the underlying problems.

By understanding the basics, you ensure these are covered before any complex diagnostics are run and costs incurred to address problems.

I’ve known clients where the systems integrator (SI) has run up significant costs doing advanced diagnostics on a website with intermittent performance issues, when the core problem could have been resolved by checking the basics.

I’ve also seen people sign-off investment in new servers where misguided advice recommended throwing hardware at what was a config or software problem.

1. Optimise images

This is always my first check. I like to use page load tools like Pingdom or Webpagetest.org and then use the waterfall view to check file sizes. Anything over 20Kb gets my attention and I’ll ask designers if there is any way to reduce it without compromising quality.

The size of images has a significant impact on the download speed of the website. However, you have to agree on an optimum balance between the size and quality of the image – whilst some images can be >20kb, they are often the key brand/category/product merchandising tools, and replacing them with low quality images may impede sales in other ways.

A quick win is addressing issues with thumbnail images in carousels. I often find these aren’t optimised for the web, or images are being scaled to size when the page is called. Getting the dimensions correct and providing the right size image really helps.
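
As a rough illustration of the thumbnail point (the file names below are invented for the example), the difference is between letting the browser scale down a large master image and serving an asset already cut to its display size, with dimensions declared in the markup:

    <!-- Before: a large product shot scaled down by the browser -->
    <img src="/images/product-1200.jpg" width="150" height="150" alt="Product name">

    <!-- After: a dedicated 150x150 thumbnail; declaring width and height also
         lets the browser reserve layout space before the image arrives -->
    <img src="/images/product-150.jpg" width="150" height="150" alt="Product name">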

You should also evaluate the potential for compressing images to reduce their file size. The Google Developers diagnostic tool is useful for running checks, but be careful when implementing the recommendations – always get them validated by a technical specialist.

One option to further reduce page size is to switch from baseline to progressive JPEG images, but there is a potential trade-off: progressive JPEGs load the full image area straight away with only some of the pixel data, meaning the image briefly looks pixelated before sharpening.

Ecommerce teams may well have to decide that the slightly larger file size is preferable to a marginally diminished UX.
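
If you do experiment with progressive JPEGs, one common approach – assuming the libjpeg jpegtran utility is available to whoever manages your image workflow – is a lossless re-encode of the existing baseline file, so you can compare sizes before deciding what to deploy:

    # Losslessly re-encode a baseline JPEG as progressive, stripping metadata,
    # then compare the two file sizes before choosing which version to use.
    jpegtran -copy none -optimize -progressive product.jpg > product-progressive.jpg
    ls -l product.jpg product-progressive.jpg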

2. Browser caching

Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.

This needs to be applied across all pages with static resources and is an important way to reduce the number of HTTP requests – the more requests, the slower the page load will be (unless you’re using multiple domains for loading assets, such as a CDN – see below).
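
As a minimal sketch of what this looks like in practice – assuming an Apache server with mod_expires enabled, and lifetimes that your developers would need to confirm – the expiry is set per resource type:

    # .htaccess sketch: far-future expiry for static resources (Apache mod_expires).
    # The lifetimes shown are examples, not recommendations.
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>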

3. Enable Keep-Alive

Enabling HTTP Keep-Alive, or HTTP persistent connections, allows the same TCP connection to send and receive multiple HTTP requests, reducing the latency for subsequent requests.

The host domain should enable Keep-Alive for resources wherever possible. 
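
On Apache, for example, this is a small configuration change (the values below are illustrative and should be agreed with whoever manages the servers):

    # httpd.conf sketch (Apache): reuse one TCP connection for multiple requests.
    KeepAlive On
    # Maximum requests served over a single persistent connection
    MaxKeepAliveRequests 100
    # Seconds an idle connection is held open before closing
    KeepAliveTimeout 5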

4. Defer JavaScript

By minimising the amount of JavaScript needed to render the page, and deferring parsing of unneeded JavaScript until it needs to be executed, you can reduce the initial load time of your page.

There is often a lot of JavaScript parsed during the initial load. You should evaluate which JavaScript files can be deferred to reduce blocking of page rendering.

Be careful though: don’t defer JS files that are needed upfront and will affect how the page performs. Review the implications of deferral with the developers.
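
The defer attribute on a script tag is the simplest expression of this. The file names below are invented, and which files are genuinely safe to defer is exactly the conversation to have with the developers:

    <!-- Needed to render the page: leave it blocking (or inline the critical part) -->
    <script src="/js/critical.js"></script>

    <!-- Not needed until after render: defer so it downloads without blocking
         parsing and only executes once the document has been parsed -->
    <script src="/js/analytics.js" defer></script>
    <script src="/js/carousel.js" defer></script>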

5. Minify scripts

A check of the website can often reveal scripts that aren’t minified – I’ve seen this most with CSS files. There are applications available that can be used to minify JavaScript and CSS files.
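
As a trivial illustration of what minification removes (tools such as the YUI Compressor automate this; check what your build process already supports):

    /* Before: readable, but carrying whitespace and comments the browser doesn't need */
    .product-title {
        color: #333333;
        margin: 0 0 10px 0; /* space below the title */
    }

    /* After: the same rule, minified */
    .product-title{color:#333333;margin:0 0 10px 0}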

6. Remove unnecessary script

Script builds up over time, even with good housekeeping. It’s sensible to undertake regular reviews to identify ‘dead’ code such as redundant HTML and remove this from the page.

Removing or deferring style rules that are not used by a document avoids downloading unnecessary bytes and allows the browser to start rendering sooner. Unused CSS can easily be identified by running simple diagnostics with tools like the Google Developers diagnostic tool.

Where there are disjointed inline scripts, these should be consolidated into a single external file wherever possible and minified. If that isn’t possible, they should be put under one <script> tag and moved to the bottom of the page.

7. Minimise HTTP requests

The more requests made by scripts, the longer it takes to load page content. One way to reduce requests is to convert image assets into sprites, cutting the total number of individual HTTP requests, e.g. arrows in scrollers and common icons such as the basket.

Assets that are being reused in multiple pages should all be consolidated into sprites and cached.
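
A minimal sketch of the sprite technique (class names and offsets are invented): the small, reused icons are combined into one image and CSS background-position picks out each one, so several requests collapse into a single cacheable download.

    /* One downloaded image, many icons; offsets depend on the sprite layout */
    .icon {
      background-image: url(/images/ui-sprite.png);
      background-repeat: no-repeat;
      display: inline-block;
      width: 16px;
      height: 16px;
    }
    .icon-basket { background-position: 0 0; }
    .icon-arrow-left { background-position: -16px 0; }
    .icon-arrow-right { background-position: -32px 0; }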

8. Optimise favicon

The favicon should be less than 1Kb in size, and versioning should not be used – a small issue I’ve noticed on a few sites. Versioning results in the asset being downloaded again every time there is a deployment, and it’s unlikely that the favicon will change that regularly.

9. Compression

Compression reduces response times by reducing the size of the HTTP response. Gzip is commonly used to compress resources but it is worth asking your development team to confirm which compression technique is being used on the servers.

Using the Google Developers diagnostic tool, you can see which resources are flagged as compressible to reduce their transfer size.
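
If the servers run Apache, compression is typically switched on with mod_deflate along these lines – a sketch only, and your development team should confirm which module and MIME types apply to your stack:

    # .htaccess sketch: gzip text-based responses (Apache mod_deflate).
    # Images are left out because they are already compressed formats.
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css text/plain
      AddOutputFilterByType DEFLATE application/javascript application/json
    </IfModule>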

10. Remove error code and duplicate scripts

If there are any errors being generated on the server, these should be fixed. The most common errors are incorrectly formed HTML tags and JavaScript errors. http://validator.w3.org can be used to validate the website for errors.

You should validate all the main pages of the website using this tool and ensure that there are no errors, although this is time intensive.

Taking the Homepage as the example, there are three errors being generated that need to be addressed.

You can always go to http://validator.w3.org and enter any URL to validate a webpage. This should be a general practice, followed on a regular basis.

11. Remove 404 errors

(Updated following feedback from Frederic).

You need to minimise the number of 404 errors as these are dead-ends. Google Analytics can be used to identify these pages. It’s important to benchmark this problem using the data available in Webmaster Tools or via other web-based software.

A 301 can be used when a page has a logical replacement, e.g. the catalogue structure has been updated and a product has moved from URL X to URL Y. The 301 is an important link between old and new whilst the new URL is established.

However, just because a page has a 404 doesn't mean it should be given a 301 to another page - the most common error is to redirect a 404 URL to the homepage when the original page is no longer relevant e.g. product has been permanently removed from the catalogue.

In such cases a 301 is not good practice because you’re sending people to an irrelevant substitute page, and as these erroneous 301s build up, they can adversely affect your SEO. Further reading at https://support.google.com/webmasters/answer/181708?hl=en&ref_topic=1724951
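
In Apache terms the distinction looks something like this (the URLs are invented; both directives come from mod_alias):

    # Product moved: point the old URL at its logical replacement
    Redirect 301 /old-category/product-x /new-category/product-x

    # Product permanently withdrawn with no equivalent: tell browsers and crawlers it is gone
    Redirect gone /old-category/discontinued-product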

It’s good practice to ensure your 404 error page explains that the page is no longer available and suggest alternatives, providing links back to key pages on the website including the homepage.

Some websites provide a site search tool on their 404 to let the customer browse for relevant information.

12. Reduce DOM elements

A complex page means more bytes to download, and it also means slower DOM access in JavaScript. It makes a difference whether you loop through 500 or 5,000 DOM elements on the page when you want to add an event handler, for example. Remove any unnecessary encapsulating HTML elements.
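
One common way to avoid touching hundreds of elements when wiring up handlers – offered purely as an illustration, with invented IDs and class names – is event delegation: attach a single handler to a parent element and inspect the event target.

    // Instead of binding a click handler to every product tile in a long listing,
    // bind one handler to the container and work out which tile was clicked.
    document.getElementById('product-grid').addEventListener('click', function (e) {
      var tile = e.target.closest('.product-tile');
      if (tile) {
        // Handle the click for the tile that was actually clicked
        console.log('Clicked product:', tile.getAttribute('data-product-id'));
      }
    });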

However, a webpage with more DOM elements will not necessarily load slower, provided the other optimisation techniques are being used effectively.

For example, for one client I benchmarked leading ecommerce brands and found that sites like John Lewis had up to 20% more DOM elements yet faster load speeds.

13. Content Delivery Network (CDN)

A CDN is most useful when the website’s user base is geographically dispersed, e.g. if you experience a continued increase in traffic coming from non-UK locations. It's also useful for video- and image-rich websites.

Using a CDN can benefit your page load speed as the resources are served from external servers, reducing the load on the origin server. However, there can often be significant costs involved.

As an example, New Look appears to be using Amplience as their CDN provider for product images and videos.

14. Other general tips

The following are good coding standards to be followed in general, as advised by some developer friends of mine.

  1. Put stylesheets at the top.
  2. Avoid CSS Expressions.
  3. Minimize the number of iframes.
  4. Choose <link> over @import in CSS files (see the sketch after this list).
  5. Don’t scale images in HTML.
  6. Use GET for AJAX requests.
  7. Use smart event handlers.
  8. Reduce cookie size wherever possible.
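
For point 4, the difference is simply how the stylesheet is referenced (a sketch with an invented file name). A <link> in the <head> lets the browser fetch the stylesheet straight away, whereas a file pulled in via @import cannot start downloading until the CSS that references it has been fetched and parsed:

    <!-- Prefer this, in the <head> -->
    <link rel="stylesheet" href="/css/site.css">

    /* Avoid this inside another stylesheet: /css/site.css cannot start
       downloading until the stylesheet containing this rule has arrived */
    @import url("/css/site.css");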

What do you look for when evaluating page load speeds?

I’d be interested to hear from other people and understand what you look at when trying to evaluate page load speed issues, especially from the tech obsessives amongst you who can add to my explanations and probably challenge me on some of the points!

If you don’t agree with something I’ve written, please let me know, I’m always open to suggestions and advice.


Published 3 July, 2013 by James Gurd

James Gurd is Owner of Digital Juggler, an ecommerce and digital marketing consultancy, and a contributor to Econsultancy. He can be found on Twitter, LinkedIn and Google+.


Comments (34)


Stuart McMillan, Deputy Head of Ecommerce at Schuh

Hi James,
Great list of tips, you've just pressed my special "site speed" button, time for me to get OCD...

I went through a comprehensive site speed improvement process on my own blog to let me fully evaluate what improvements can be made. I took the speed from about 5 secs to about 0.8 secs, although that has now gone downhill a bit since I have started experimenting with the new Google user feedback tool...

Some additional tips:

Get the HTML to the browser as quickly as possible. Nothing else starts until the browser has the HTML, so look at the data start time and do everything you can to improve it. For my blog, I cached all the HTML pages *in memory* on the server, only 4 lines of code run to output a page. The data start time is 0.002 seconds, on an inexpensive server setup. It's also great for server performance, it's highly scalable.

Look at the quality of hosting, I switched to Rackspace cloud hosting, the latency of my site improved dramatically.

Part of getting the HTML quickly is to *keep the HTML as light as possible*, every byte counts. It's also a good idea to keep the DOM as shallow as possible as it speeds up the creation of the render tree.

Specify image dimensions in the HTML, this allows the browser to allocate space for the images in the layout, saves repainting once the images have downloaded.

Limit requests to 6 per domain, this will optimise parallel downloading of objects.

Base64 encode small images and add them to CSS or HTML to reduce the number of objects. For an example of this, look in the source of the Tesco homepage for <link rel="stylesheet" href="/homepages/default/datauri.240.css" type="text/css" media="all" />.
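
A minimal sketch of what that looks like in a stylesheet (the base64 payload here is truncated placeholder text rather than a real image):

    /* Small icon embedded directly in the CSS: no separate HTTP request */
    .icon-search {
      background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...");
      width: 16px;
      height: 16px;
    }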

Write valid HTML, arguably it renders quicker as the browser isn't having to error correct. It also feels good.

about 3 years ago


Margaret Collins

Excellent list--often, this type of article only skims the surface; you're providing some actionable advice here. We're heavy Javascript users and do try to follow much of this advice as best practice. Interesting point about 404 pages vs. redirects.

Stuart, good additional points re: HTML, though in the points both you and James make, you assume the page must be dealing with static image assets. Admittedly, even though it's been around for more than a decade, dynamic imaging still isn't mainstream knowledge; but it's getting there.

Basically, dynamic imaging is the process of using an imaging server to generate all required image variants from a single master image--in this case, whatever image dimensions are required (so if all you need is a 150x150 pinkynail, that's all that will be delivered even if the master is 4,000 pixels wide).

Doing all the heavy lifting server side (instead of using the browser to scale) and only delivering the appropriate variant to the browser REALLY helps cut down on load time. It's a tremendous advantage for mobile especially.

-Margaret
LiquidPixels, Inc.

about 3 years ago


Stuart McMillan, Deputy Head of Ecommerce at Schuh

Hi Margaret,
Not sure if you are also aware of some of the chatter in the HTML5 community about adaptive images, "srcset", <picture>, etc, but hopefully its adoption will solve some of these problems in-browser. It will allow HTML authors much greater control of images, as one may well choose a different (not just smaller) image for a different screen size, for legibility, for example.

Best,
Stuart

about 3 years ago


Geoff Paddock

James, interesting piece. Page load times are crucially important, even for non e-commerce sites, but this is one area that can literally impact on the organisation's bottom line. Readers who want to check, among other things, the size of images on any website might be interested to try the free Snapshot tool from Sitemorse. The link can be found by clicking on my name to the left.

For many developers, Sitemorse has become part of their standard set of tools. You can use the tool to check your live pages, areas under development (and even other people’s pages) to ensure the crucial areas of performance, compliance and quality are in good shape.

about 3 years ago


Frederic

Hi,

Using 301s to redirect all 404s is simply bad practice. If you remove a page, buy a domain, or otherwise retire content and there is no equivalent content to redirect to (90% related), you should either 410 the page or let it 404. Google specifically states that if you act in the best interests of the user you will be following best practice. Sending a user from an article about the Vietnam war to an ecommerce homepage, for example, would be the wrong use of a 301 (real story, happened last month).

Many 301s, especially to the homepage, will hurt your SEO.

about 3 years ago


Frederic

Further to my previous:

https://support.google.com/webmasters/answer/181708?hl=en&ref_topic=1724951

If you are finding excessive 404 messages in your Webmaster Toolkit then you may want to add a noindex meta tag to your 404 page to hide these. Personally I don't recommend this as seeing 404s can actually add value. Instead allow pages to be retired as 410s (expiry date, "retired" flag or some such) and put a meta tag for robots set to noindex. This way search engines will only ignore those pages that you have explicitly acknowledged as no longer being active.

Bear in mind that the robots meta tag set to noindex does not prevent the page from being crawled; it usually just prevents it from appearing in the search results (the crawler needs to fetch the page to find the meta tag). To prevent crawling completely you need to deny access to the page through your robots.txt.

about 3 years ago


Malcolm Duckett, CEO at Magiq

...and don't forget tagging and web analytics. The problems here fall into two categories:

1 - some sites we see have Web Analytics tags making up 30% of the page content. Today this is unnecessary - smart capture systems (from companies like Magiq and Celebrus) will capture ALL the data you would ever need via a static js file, which gets cached by the browser. This reduces load time dramatically and makes deployment simpler (note that many tag management systems do not achieve this, as they just load more tags dynamically, or embed them in-line - so they are still slowing your page, you just can't see them).

2 - make absolutely sure that page load time is not impacted by the performance of the web analytics service - there are scary examples of major global-brand sites being taken down by their web analytics service provider's own services failing, and the site owner being unable to rapidly un-tag the site. Try resetting your host files to make your web analytics system's servers invisible (or make them slow) and see what happens to your site performance - you might not like what you see!

(the same is of course true of live-chat and other external apps - you can't be too careful in this situation.)

Great post....

about 3 years ago


Ryan Murton

You even mentioned the Favicon... I can't get over that!

about 3 years ago


Gerry

Nice read James, I've had some good results with compressing CSS & JavaScript along with using Gzip as mentioned above.

Image compression I've always saved my own files but recently I've started to use jpeg mini and tiny png to add further compression to images.

Using the above I got a recent site launch from 800kb and an 800ms load time to 402kb and an average UK load time of 400ms.

about 3 years ago


Mike Upton, E-Commerce Manager at Demon Tweeks

Great article James that provides a good starting point of what needs to be looked at to speed sites up.

Images are often one of the biggest bandwidth hogs with any website and I found this article a while back that investigates serving 'retina' sized images with higher compression that can result in lower file sizes and still look fine on both normal resolution screens and retina screens:

http://blog.netvlies.nl/design-interactie/retina-revolution/

Worth a read for anyone looking to reduce image file sizes.

about 3 years ago


James Gurd, Owner at Digital Juggler

Morning all thanks for the comments,

@Frederic - yes you are right, apologies for a sloppy explanation. What I should have clarified is that a 301 is used when a page has a logical replacement e.g. catalogue structure has been updated and product X had now moved URL to Y. The 301 is an important link between old and new whilst the new URL is established. However, just because a page has a 404 doesn't mean it should be given a 301 to another page - the most common error is to redirect a 404 URL to the homepage when that page is no longer relevant e.g. product has been permanently removed from the catalogue. In that case you are right, a 301 is not good practice. I'll get the post updated to make the information clearer, thanks for noticing.

@Stuart - thanks for the interesting insights into the optimisation of your own blog and for the comment about getting the HTML out as quickly as possible. It's interesting when looking at page load speed to pick out the different dimensions - full doc load is one, but you're right that first byte is hugely important as it relates to perceived load time for the customer. The full doc can often take in excess of 5 secs to load, but it's crucial that key content is visible to customers as quickly as possible so they don't perceive a performance problem.

@Margaret - yes dynamic imaging can be useful, especially for mobile sites/apps. Solutions like Scene7 enable preset images that can be called via URL parameters. Stuart, I've not seen the latest HTML5 chat, can you send a link to the relevant discussions?

@Malcolm - yes excessive and untidy analytics tagging can compound the issue. What's your experience of using tag containers to help with tagging house keeping? I know a few people have gone down that route but then also experienced problems with implementing the tag management system.

Thanks
james

about 3 years ago


James Gurd, Owner at Digital Juggler

Thanks for the additional comments.

@Gerry - thanks for sharing the practical advice.

@Ryan - yes, I know it's a tiny tiny impact but my view is you should be comprehensive in your analysis because everything adds up. I've seen some crazy size favicons which add several Kb to the page.

@Mike - thanks for the article share, will take a peek.

Thanks, James

about 3 years ago


Malcolm Duckett, CEO at Magiq

@James, our experience is that Tag Management is really an exercise in hiding the tagging issue, rather than solving it....the tags are still there and slowing the page.

As the folk who first invented "Tag Free Capture" in the late 90's we can hardly claim an unbiased view of things, but we are staggered by the amount of time, effort and money people continue to invest in tagging when other solutions are available - at one time the argument was that this type of solution provided "too much data", but in these days of "Big Data" that's not an argument that holds much weight. Our perspective was that using a single cached insert to collect everything about the user's interaction with a page was cheaper in load time, deployment and CPU overhead than tagging (and more robust from a performance perspective) - I think experience has proved that to be true, with some of the world's largest brands using this approach and seeing top-notch page performance.

about 3 years ago


Stuart McMillan, Deputy Head of Ecommerce at Schuh

@james, HTML5 discussion on adaptive images:
http://html5doctor.com/html5-adaptive-images-end-of-round-one/

about 3 years ago


Marcus Law, Marketing Manager at SLI Systems

Great tips James! In my experience at SLI Systems I realised that the average online shopper expects web pages to load in two seconds or less, and will abandon the page after only three seconds.
I would then suggest something that might sound trivial, but that actually could be useful. Regularly review the performance of your site – check from home where you may have a more typical (probably slower) Internet connection. Check then for newer versions of software (for example, Web server or e-commerce platforms) for performance enhancements.
I also think that fast performance isn’t just about the speed of the page load; it's also about the time it takes to navigate through information. Fast page loading is particularly critical when it comes to search, as site visitors tend to scan search results instead of actually reading them line by line. What do you think?

about 3 years ago


Deri Jones, CEO at SciVisum.co.uk

Great discussion on web speed and performance which is what I do all day every day, so an issue close to my heart!

One thing I'd be interested to hear views on: if you pass these rules to your tech teams, do you have any way to measure whether they implement them or not? Did you ask for any of these already, some while back?

But a really important issue -and one that I see every week on eCommerce sites huge and small:
A site may have pages that in isolation are pretty fast, but when a real user follows a meaningful journey, they see major slowness at certain times of day, or during certain campaigns - which can be caused by loads of under-the-bonnet things like session handling, basket handling, stock handling, etc., which the above one-page-at-a-time stuff does not address.

A bit like the difference between measuring a car's engine etc on the ramps - until Stig takes it for a test drive, you really don't know how well it will perform in the real world of corners etc!

So in terms of the ongoing management of site speed, to ensure it's not just a once-in-a-while fiddle under the bonnet but genuine ongoing continuous improvement, the key is to have measurements of your meaningful multi-page user journeys running 24/7.

A useful mantra is that these Journeys should 'Do What the Customer Does' : so find products and put them in the basket etc.

That data, gathered 24/7, can then be put alongside your web analytics data, and you can start to correlate conversion drop-off with performance drop-off.

Now that is valuable data to make decisions around!

You have real hard data to make a business case to the board for money to update the software or buy the new hardware you need, to implement cool ideas like Stuart's on a big scale!

about 3 years ago


Stuart McMillan, Deputy Head of Ecommerce at Schuh

@Deri, I couldn't agree more!

On reflection, one of the best things I have done to improve our site speed at Schuh is to keep talking about it!

I've also trained the people who upload all our banner images, etc to go in to Google's PageSpeed tool in Chrome and see if any lossless compression can be applied (which the tool will provide for you).

It's also part of our weekly ecommerce reporting.

This week I have also been looking at speed consistency, so looking at the average (mean) speed in our monitoring tools, then looking at the standard deviation from that and then looking at how often the speed exceeds the mean+SD, within the core business hours of 7am - 11pm.

Speed consistency is important, so you need to come up with a standard which defines acceptable from unacceptable, then gather together any exceptions and look for patterns.

about 3 years ago


James Gurd, Owner at Digital Juggler

Afternoon all and thanks for the additional comments.

@Marcus - very good point, speed of information retrieval is important too, though harder to quantify. There are useful eye tracking tools like EyeQuant and scroll/heat mapping tools like CrazyEgg that can be used alongside web analytics to find out whether people are liking the page or not.

Search results is an interesting one and I think depends on the product type. A complex technical product may be best served by a more thorough set of info in search results because visitors need additional cues to make the right choice.

As with any webpage, it's about visitor intent and creating a UI design that makes it easy for people to get the most important information quickly.

I'm a big fan of segmented search results pages with tabs for different content types - http://search.adorebeauty.com.au/search#w=moisturiser&asug= is a nice example.

@Deri - you can easily run diagnostics using things like the Google Developers Tool to find out if the improvements have been made.

I find when working with developers that if you give them the full picture of what you're trying to achieve and why, and ask for their involvement in defining the right approach, you get better buy-in than trying to impose changes on them (a bit of tech pride at stake sometimes).

And yes important to measure whole journeys as well. Many of the site monitoring tools let you set up journey flows with multiple steps and then track the speed of the entire journey + each step en route.

I'd advise starting with the most important webpages and journeys first; trying to do everything at once leads to meltdown.

Of course doing page speed analysis isn't everything you should be doing. As you suggest, many other factors affect performance. A good example is database memory management - I've been in meetings where an SI has insisted the client invest in new servers and load balancing to resolve crashing issues when the traffic volumes don't justify it. The real reason is often db issues such as poor housekeeping, so memory gets eaten up and the db locks.

Page speed analysis is only 1 part of an overall ecommerce performance monitoring program.

@Stuart - would love to learn more about how you're doing the speed consistency analysis and reporting. That's not something I've been so closely involved in with Clients.

Thanks, James

about 3 years ago


Jon Baron, Co-Founder and CEO at Electrocomponents

Further to the issues outlined by James above, and as mentioned within some of these comments, the presence of vendor tags on websites is a primary cause of delayed page loading.

TagMan has an advanced tag management system significantly improving page load times. For example, when Holland America adopted our TMS they saw a 42% increase in the speed at which their online content loaded. Prior to using TagMan, the tags on the Holland America website took 2.5 seconds to fire, contributing to a page load time of over 9 seconds – enough to make even the most patient of individuals think of reaching for the back button. With TagMan's Smart Tag Loading functionality in place, the cruise liner’s site now fully loads in almost half the time, with the tags firing in less than 1 second.

In reply to some of the concerns raised in earlier comments, tag management systems operate by removing all third party tracking tags from your website and rehousing them in a single container tag. TagMan’s TMS acts as the gateway, only firing tags onto the page as and when they are called and refusing to fire those that get stuck altogether. The very purpose of a TMS is to optimise your website for both speed and analytics, so that website owners do not have to sacrifice one in favour of the other.

about 3 years ago


Ben Seymour, Director, Professional Services at Amplience (UK) Ltd.

Hi James,
Thanks for a great article - I recently set myself the challenge to see just what performance improvement I could find in just 1 hour: http://allbs.co.uk/2013/05/19/web-performance-optimisation-hour-1/

There were two follow-on hours, which took the site in question from a PageSpeed score of 79/100 to 99/100, reduced page weight by around 80% and page load time by 60%, and got it to a five-A rating on WebpageTest.

This was for a relatively simple site, but my main takeaway was just how much benefit could potentially be achieved in a surprisingly small amount of time, if efforts were focused effectively.

You kindly mentioned Amplience's Interactive Merchandising solution in #13, and with regard to image optimisation I would also like to highlight the benefits of considering a server-based dynamic imaging platform, such as Amplience's Dynamic Media System. This enables at-request-time resizing/compression/sharpening of image assets, which as well as providing business-level controls through Transformational Templates, also facilitates a responsive images approach to RWD (http://allbs.co.uk/2013/01/20/responsive-images-responsive-layout/), with all assets being served via our CDN. Additionally there are some potential SEO benefits from being able to de-couple the image in question from the apparent image request (http://allbs.co.uk/2013/01/07/image-seo-dynamic-imaging/).

All the best,
Ben

about 3 years ago


Deri Jones, CEO at SciVisum.co.uk

Hi James

> Many of the site monitoring tools let you set up journey flows with multiple steps

It's always good to hear people's personal recommendations - which specific tools do you yourself use ?

about 3 years ago


Alberto Valle

My question is: is the page load time reported by Google Analytics accurate enough to serve as a basis for actions to improve my performance?

about 3 years ago


James Gurd, Owner at Digital Juggler

Evening all,

Hope you've been enjoying the scorching sunshine.

Thanks for the additional comments. A few responses for you:

@Malcolm - one of the key benefits of tag management solutions is to use the container tag (as @Jon Baron points out) which reduces the code clutter in the page and help with page load speed. However, you need someone who understands how to implement the container tag properly to ensure all the other tags fire correctly. In my experience, done well it can really help with page speed, plus make life easier for ecommerce teams.

@Ben thanks for the extra info and the link to the blog on your own optimisation efforts - useful reading. Hope life at Amplience is going well.

@Deri - it has been a while since I reviewed the market as usually my Clients have one already in place. One of the most popular is Site Confidence. Others include Smartbear & Keynote. Monitor.us is a free tool but I've not personally used it or know anyone who has - worth taking a peek.

@Alberto - the GA time isn't an exact measure, it's a benchmark to help you monitor the trend over time to see if it's falling or rising. Plus it doesn't give you the granularity that a proper monitoring tool will do e.g. first byte timing. IMO GA isn't a site monitoring solution but you can use the page speed reports for reporting purposes, keeping an eye on this as a KPI and then doing further analysis if you see it change significantly.

Thanks
james

about 3 years ago


Deri Jones, CEO at SciVisum.co.uk

James

> @Deri - it has been a while since I reviewed the market as usually my Clients have one already in place. One of the most popular is Site Confidence. Others include Smartbear & Keynote.

I have a vested interest - my company SciVisum is in this sector.

You're right, Siteconfidence used to be a good choice, they are now part of NCC, as of a while back.
But the market has moved quite a bit in the last couple of years - many companies now want more realism and truly dynamic user journeys, and we're seeing a stream of that kind of client move across to us. Simple record-and-playback, or static URL replaying, no longer provides meaningful data on today's complex sites.

Smartbear are not a bad choice for marketeers as it's simple to get started, though again there's a danger of simplistic journeys providing data that isn't realistic.

Keynote - have a very powerful, but very complex tool, which is 100% self service. Some companies use it well, but you really do need a full time person to use it, to get value. And it's orders of magnitude more expensive.

This thread has been interesting, showing that there's a lot people can do in-house pretty easily on page speed, but not a lot of chat on measuring true multi-page user journeys.

I wonder, is that because it's sometimes hard for marketers to get traction from technical teams on things as they get more and more 'customer realistic' and less and less 'technically obvious'?

about 3 years ago


Malcolm Duckett, CEO at Magiq

@James,
It certainly seems you have hit upon a "hot topic here"... It's nice to see the community engaged in the discussion.

Having now reached "Measurement of Performance" in this discussion, there are another set of issues to be wrestled with, and those relate to "when is a page loaded".

In this age of dynamic/ajax pages and embedded applications in pages, the old approach of looking to the "page complete" flag as being the marker that signifies the page is "ready to use" often does not work.

We regularly have discussions with clients because their tool is saying "this page is slow to load", when in fact the page is loaded and functional, but background processes (like in-page event monitoring) are still running - and will continue to run until the page is unloaded.

So, the issue forks to become on the one hand "how long is it before the page can be used" (and this may not require the loading of all the images etc.) and the point Deri is speaking of "how long does it take to complete the process the user is attempting" - and that second one can prove to have nothing to do with the web site or technology.... to take an example I was involved in -

We were working with a major UK airline who was concerned that their navigation was too slow because the average user spent 26 minutes on the site. Our analysis proved that users typically got the flight information they were looking for in about three pages and 45 seconds, but then spent the next 25 minutes looking for a better price! Sometimes it's deeper than the technology :-)

about 3 years ago


James Gurd, Owner at Digital Juggler

Hi Deri, Malcolm

Thanks for the further comments. I'm glad this has been a popular post - I did worry when I wrote it that it might be seen as the 'dull' end of ecommerce but I think it's a really exciting part of optimisation - if you get it right, it makes a big difference to customer service + can save you lots of £ on unnecessary work.

@Deri - interesting as I don't know your product and to be honest, it has been nearly a year since I last looked at a Client's monitoring tools because none of my recent projects have been in this area. I'll certainly take a peek.

I agree that multi-page paths are essential and you can use web analytics tools to identify popular paths and set these as benchmarks.

Re the reason why it's often not a focus - personally, I think it's a lack of understanding on the marketing side. There isn't always a performance champion in the ecommerce team who is champing at the bit to get into the detail. Usually it's a 'we looked at the GA report and our av speed is down 10% this month, good work!' or a simple discussion with the SI/dev team, taking recommendations at face value. I don't mean that to be patronising, just from my experience it's an overlooked area. If I was recruiting an ecommerce mgr, I would want them to understand the basics of technical optimisation.

@Malcolm - yes it's an important distinction you make - full page load vs. perceived load (or useful load). The critical point is how quick the action elements of the page load e.g. content is visible, tasks can be completed.

In your experience, what is the 'right' way to approach this and what advice would you give to ecommerce managers when deciding what measures to use?

Thanks, James.

about 3 years ago


Ben Seymour, Director, Professional Services at Amplience (UK) Ltd.

@James,

Far from being the dull end of eCommerce, it feels to me like the technical equivalent of getting the endorphin rush when you go to the gym :-) - a really valuable and satisfying activity to undertake, that can have directly measurable and meaningful results

about 3 years ago


Deri Jones, CEO at SciVisum.co.uk

I wonder if many eConsultancy folk would agree with James on this:

> If I was recruiting an ecommerce mgr, I would want them to understand the basics of technical optimisation.

about 3 years ago


Stuart McMillan, Deputy Head of Ecommerce at Schuh

@Deri

It certainly helped me get my job! :)

about 3 years ago


Malcolm Duckett, CEO at Magiq

@James,

When we start thinking about "performance" at the application level, the good news is that you have your entire user community demonstrating every day how long a process takes - and they are the right (and only) thing to measure.

Measuring using server-side tools or measuring using robot systems are only proxies for the truth, a truth which only occurs where the real users meet your application/web site in their browser.

This is why we are such advocates of using client-side performance measurement (in the browser), and measurement of the real user community. Done properly this unearths technical, design, network and usability issues at a stroke - we have any number of examples of where this data reveals key insights, typically (as one of our users at PCWorld once described it) "via a sequence of head-slapping moments" where the data reveals something that one feels should have been obvious from the start - but wasn't.

From a page design standpoint I think the factors you described in the original post are all correct, and I would only add the risks associated with "on-load" scripts, which can delay key page elements waiting for some non-critical component, and the dangers of linking to external sites which may delay page loading. Make sure you write to load the things the user will need FIRST, then the nice-to-have stuff (like Analytics/Chat etc.)

about 3 years ago


James Gurd, Owner at Digital Juggler

Evening all,

I'm loving this discussion.

@Ben - we really should get out more!

@Deri - good question, have posted it to Twitter to see if we can get some response. 2-3 years ago probably not seen as so important but now, with the increased commercial focus in ecom teams, I think the performance part is becoming more valued. Thankfully.

@Stuart - how did you get to your level of technical knowledge? Was it self-taught or did you learn from previous roles/managers etc?

@Malcolm - would be good to take this off line and learn a bit more about what you do. Fancy a chat sometime? If so drop me a line at james@digitaljuggler.com.

cheers
james

about 3 years ago


Stuart McMillan, Deputy Head of Ecommerce at Schuh

@james

I've always had an interest in efficient solutions, back from when I started to program as a youngster. I then worked with a tech director who helped me improve my skills, and I tried to learn as much as I could on the subject. Having something small like my blog to work on was one of the most effective learning experiences.

about 3 years ago


James Gurd, Owner at Digital Juggler

@Stuart - Yes I should probably follow my own lessons and look to do that on my own site - always seems to be some client work to do that stops me taking it to the next level. Typical consultant:)

about 3 years ago


Beren Gamble, Ecommerce Director at AQ/AQ

Two important items which aren't mentioned are to have a site and hosting environment that perform well in the first place.

Our developers work very closely with our hosting company to optimise caching on the server side. Also, our developers have rewritten a lot of Magento's database interactions, especially for the category pages, which now display in under a second even when viewing all products.

Doing this reduces the TTFB (time to first byte), which none of the above advice will help with.

Ideally you need to tackle the above AND hosting and the speed of database queries.

about 3 years ago
