As familiar names like HMV and Blockbuster disappear from the High Street, web traffic can be expected to grow.

However, the increasing number of data aggregators and tracking tags being placed on websites is leading to slower-loading pages, while advances in technology designed to save people time have made us less tolerant of waiting.

In 2006, the average web user expected pages to load in four seconds or less. By 2010, that expectation had become two seconds or less.

Today estimates vary, but most studies conclude that consumers believe websites should load with little to no delay: 57% of users abandon a page that takes longer than three seconds to load, and 67% of UK consumers will quit an online purchase because of slow loading.

Worse still, other studies have shown that every one second of delay causes a 7% decrease in conversions, amounting to billions of dollars in revenue lost each year to slow websites alone.

Given the potential revenue loss that slow pages represent, the expectation is that page load time, or latency, would be constantly improving. However, this isn’t the case – in the two years from 2010 to 2012, page load times across major sites increased by 48%, and this number is growing.

In fact, average page load times for large publisher and brand sites now exceed three seconds. The problem is just as bad on mobile devices: when we looked at popular mobile websites, we found an average page latency of 1.68 seconds in January, rising to 1.74 seconds in just one month.

When initially trying to optimise page speeds, companies focused on issues associated with delivering content, such as the size of the page or structure. While large pages – those with a lot of scrolling content or complex structures – were initially difficult to optimise, these early problems have largely been assessed and fixed.

This resulted in an average page load of eight seconds in 2002 becoming an average of two seconds in 2010.

But there is a separate, significant contributor to slow page loads that has little to do with the content or the size of the webpage: tags.

Companies implement tags for many reasons. Some provide analytics, display social widgets or embed video elements, while others make serving adverts possible.

In order for adverts to appear on publisher and ecommerce pages, a snippet of code known as a script or tag is embedded in the page. When the page is loaded, this element causes the browser to call an outside server and retrieve the advert.

Sometimes several calls are made for a single advert to a host of companies that may be bidding for that ad spot, creating a “chain” of calls from the page. Individually, these tags may not slow down a page, but taken in aggregate, especially when there are many of them, these elements can cause considerable delays.
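To make the aggregation effect concrete, here is a minimal sketch in Python of why chained tag calls add up. The per-tag delay of 100 ms is an invented illustrative figure, not a measured value:

```python
# Illustrative sketch: per-tag delays are hypothetical, not measured values.
# When tags load sequentially (a "chain"), the total delay is the sum of the
# individual delays; when they load concurrently, it is only the slowest one.

tag_delays = [0.1] * 20  # twenty tags, each adding a modest 100 ms

sequential_delay = sum(tag_delays)  # chained, one after another
parallel_delay = max(tag_delays)    # all fired concurrently

print(f"sequential: {sequential_delay:.1f}s, parallel: {parallel_delay:.1f}s")
```

With twenty tags at 100 ms each, a fully sequential chain costs two full seconds – consistent with the per-page figures cited below – while the same tags loaded concurrently would cost only a tenth of a second.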

The average website now hosts over 75 tracking elements per domain, up from only 25 in 2003. Over 20 elements per page on average equates to at least a two-second delay, often more. As elements increase, so does page load time.  

Due to the evolution of advertising technologies and analytics services, many publisher and ecommerce sites have added these tags haphazardly over time, across multiple, disparate departments.

This produced several unintended results, all of which contribute to slow web pages.

  • The first is an inefficient tag structure, where multiple calls are made to a single company, or several companies performing the same function are allowed onto a page. Each of these excessive calls should be examined and removed where possible.
  • Second, the cumulative nature of adding tags can create long tag chains, where fast elements sit behind slower ones, forced to wait for their slower partners to appear on the page.
  • Lastly, as tag structure and volume are relatively recent areas of focus, many publishers and ecommerce sites do not have a particular person or group of individuals monitoring their placement and performance.
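The first problem – redundant calls to the same vendor – lends itself to a simple automated check. The sketch below groups a page's tag calls by vendor domain and flags any vendor called more than once; the tag URLs are hypothetical, and a real audit would extract them from the page's markup or from browser network logs:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical list of tag URLs found on a page (illustrative only).
tag_urls = [
    "https://analytics.example-vendor.com/collect.js",
    "https://analytics.example-vendor.com/pixel.gif",
    "https://ads.example-exchange.com/bid.js",
    "https://widgets.example-social.com/share.js",
]

# Count calls per vendor domain.
calls_per_vendor = Counter(urlparse(url).netloc for url in tag_urls)

# Vendors called multiple times are candidates for consolidation or removal.
redundant = {vendor: n for vendor, n in calls_per_vendor.items() if n > 1}
print(redundant)
```

Here the analytics vendor is called twice, so it would be flagged for review, while the single-call vendors pass.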

Each of the technologies behind these tags provides a valuable service to the publisher. It is possible to have a fast website that takes advantage of tags, so long as steps are taken to improve performance.

Companies need to take stock and investigate which elements are found across their pages. They then need to discover how each of these vendors came to their pages and any third-party tags they may be bringing with them.

Once these are known, publishers can then assign their improvement and control to a given department or individual to ensure that performance is upgraded and maintained.
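Once ownership is assigned, ongoing monitoring can be as simple as holding each tag to a latency budget. A minimal sketch follows; the tag names, timings and the 500 ms threshold are all invented for illustration, and real numbers would come from synthetic monitoring or real-user measurement:

```python
# Hypothetical measured load times per tag, in seconds (illustrative only).
measured = {
    "analytics": 0.12,
    "social-widget": 0.45,
    "ad-exchange": 1.30,
}

BUDGET_SECONDS = 0.5  # assumed per-tag budget, not an industry standard

# Flag tags that exceed the budget so their owner can investigate.
over_budget = sorted(tag for tag, t in measured.items() if t > BUDGET_SECONDS)
print(over_budget)
```

A report like this, run regularly, gives the responsible department a concrete list of which vendors to chase, rather than a single opaque page-load number.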

If companies don’t get their websites in order and the problem becomes worse, experiencing the slow days of dial-up is a very real possibility indeed.

Image credit: kewl via Flickr

Published 8 May, 2013 by Todd Ruback

Amy King is Vice President, Product Marketing at Evidon and a contributor to Econsultancy. 

Comments (2)

Malcolm Duckett, CEO at Magiq

Amy you're soooo right, we are amazed at some of the poor page load times we measure on sites, and the rise of "tag management systems" is just making this worse, as they allow companies to embed more and more tracking and content management tags into the pages.

...and you don't need to suffer this delay at all - just look for current generation systems that use a single tag to collect a complete record of the visitor's behaviour and allow tag-free content management and your site's performance will be transformed.

At Magiq we got obsessive about page load performance, and today Gomez says our home page took 0.77 seconds to download, not too shabby :-)

about 5 years ago


Mike Foskett

Great article which perfectly outlines my performance fears of adding more and more tagging to the Tesco homepage.

Currently, by the "Gomez" example, the page content loads in 0.275 seconds.

The analytics are added post-load and take the total time up to 0.596 seconds. That's more than double but, as it's post-load, it is completely imperceptible to the user.

We are currently investigating the use of Adobe's Test & Target to switch in personalised content. Something I fear may add seconds to an otherwise excellent page speed.

Fantastic article which I'll be quoting from in the weeks to come, thanks Amy.

about 5 years ago
