TagMan’s Paul Cook recently asked if he could borrow a popular entertainment blog I own for the purposes of running an experiment. I said yes and promptly handed over the keys to Hecklerspray.com.
Paul wanted to figure out how the placement of tags on web pages affects performance, and the results of the research turned out to be a bit of an eye-opener.
I’ve interviewed him to find out more about the issues. A link to the research can be found at the foot of this Q&A…
You’ve just published a study on how tag placement affects websites. What were the key findings?
The takeaway is that the standard way to include conversion pixels in iframe container tags can cause a serious loss of conversion data.
People have assumed that third-party cookies were the main reason for the discrepancies between third parties' numbers and the client's own data, but it actually depends more on page download speed and how the tags are included in the page.
As well as losing conversion data, this means behavioural targeting pixels are not fired which further hinders the marketing optimisation process.
The second surprise was just how impatient people are. The publisher sites that took part (www.hecklerspray.com and www.askaprice.com) appeared to lose around 10% of their traffic for every second the page took to load. If a tag typically takes 1/10th of a second to load, then each tag at the top of the page would lose you around 1% of your traffic.
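That arithmetic can be sketched as a simple linear model. The 10%-per-second and 0.1-seconds-per-tag figures come from the study above; the 2-second baseline page load is just an assumption for illustration, and real abandonment curves are unlikely to be perfectly linear:

```python
def traffic_retained(load_time_s, loss_per_second=0.10):
    """Fraction of visitors retained for a given page load time,
    under a simple linear-loss assumption (10% lost per second)."""
    return max(0.0, 1.0 - loss_per_second * load_time_s)

# An assumed 2-second page, before and after adding one ~0.1 s tag:
base = traffic_retained(2.0)      # 80% of visitors retained
with_tag = traffic_retained(2.1)  # 79% retained
print(round(base - with_tag, 3))  # ~0.01, i.e. roughly 1% of traffic per tag
```

The point of the sketch is that per-tag costs compound: five slow synchronous tags at the top of the page would cost around 5% of traffic under this model.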
How are tags typically added to web pages?
Where the client owns the relationship, the tags are generally coded directly into the bottom of the page.
Some web analytics vendors suggest putting tags at the top of the page so that page download and abandonment times can be measured, but this needs careful thought because it can have a negative effect on user experience, losing traffic as a result. It's a good demonstration of Heisenberg's uncertainty principle: by measuring these things you are changing the result.
Where agencies own the relationship, container tag solutions from the main ad-serving platforms are used. The most popular of these – DoubleClick Floodlight – writes out tags within an iframe. This method came out worst in our tests when placed at the bottom of the page.
What are the main problem areas?
Well, the proliferation of tags in general, and the inability of traditional content management systems to manage them, is the root cause of the problem. Added to this, there has been a lack of evidence as to what best practice is, so people either rely on speculation and hearsay or just don't think about how tags should be deployed at all.
Both of the sites that took part had tags from solutions that were no longer being used. On one of the sites, tags for a free analytics tool that was no longer being used were adding 10% to the page download times.
Tag requests tend to be rushed through at the last minute, which leads to a whole host of issues, and most marketers are just happy to get them on the page. The whole issue sits on the line between marketing and IT, although most of the IT people I've spoken to are well aware of the issues and put the tags at the bottom of the page.
How big of an issue is this? Is it a widespread problem?
It’s something that just about everyone needs to think about and it’s certainly an issue for the companies we come across.
Certainly the free container tag solutions have become prolific amongst display advertisers, but how big a problem this is will vary depending on their brand, value proposition and page load speed.
What are the threats to publishers and website owners who fail to optimise tag deployment?
As data is becoming an ever more important part of the online marketing ecosystem, leaking data is leaking money.
The threats to publishers are that they are losing traffic if tags block content that users perceive they can get from elsewhere, and for those publishers who are paid for their data by behavioural targeting companies there is an even more direct loss of revenue.
Similar problems apply for clients regarding losing traffic by having too many tags in the page or tags that load slowly. Furthermore, poor data can cloud the picture of online marketing effectiveness and take the edge off their online marketing optimisation.
How do you think media buyers / advertisers will react to this? What do they need to be aware of in future?
One senior agency technology person I've spoken to said that they were hardly surprised, and that the study explains an awful lot about discrepancies they were already aware of.
It is something of a double-edged sword for them because on the one hand it could help resolve discrepancy issues that can waste a lot of time, but on the other hand it challenges the free and easy approach.
My main recommendation is to be aware of how performance issues with client websites can affect data loss. Where appropriate they should suggest the client considers putting the tags at the top of the page body. The case study provides a good business case to start the conversation.
What is the recommended best practice in this area?
As the study covered only two sites, my first recommendation would be to conduct a controlled test. Companies wanting to do this themselves should email us for the full report, which includes the code and methodology we used.
I would move my container tag solution to the top of the page body and put my web analytics code at the bottom. I would also set up monitoring on the container tag provider to ensure that response times are acceptable and replace it with another provider if the answer comes back negative.
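A monitoring check like the one described could be as simple as timing periodic requests to the container tag URL. This is a minimal sketch, not anything from the study: the endpoint and the 0.1-second threshold below are hypothetical, and a production monitor would sample from multiple locations and track trends rather than single requests:

```python
import time
import urllib.request

def tag_response_time(url: str, timeout: float = 5.0) -> float:
    """Fetch a tag once and return the response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include transfer time, not just time-to-first-byte
    return time.perf_counter() - start

# Hypothetical endpoint and threshold -- flag the provider if it is slow:
# latency = tag_response_time("https://tags.example.com/container.js")
# if latency > 0.1:
#     print(f"Container tag slow: {latency:.3f}s")
```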
Additionally, I would move any tags that are in the page currently – and which can be served within an iframe – into the container tag solution. Tags served within an iframe will not block the rest of the page from loading, so I would only need to monitor the container tag solution to ensure user experience is not being affected.
Obviously I’d just avoid all these problems by whipping all the tags off the page, putting them inside TagMan and having just one tag on the page!
Did you learn anything else from the study?
The implications of latency on page abandonment came as a real surprise to me. Traffic loss per second is a key metric that every site owner needs to know, as is how likely visitors are to return subsequently. This information makes it possible to create a sound business case for investment in the wide variety of solutions that can improve page load times (and therefore the user experience).
Marketers need access to good-quality site monitoring data, not just so that the traffic they drive to the site converts, but also so that they know when it does.
You can grab a copy of the two-page summary (PDF) from the TagMan website. A full copy of the research is available on request.