Google’s Quality Score is forcing paid search advertisers to adopt joined-up thinking in PPC campaigns, writes Andrew Girdwood.
Google uses an algorithmic weighting called ‘Quality Score’ to help determine ad position and pricing in its pay-per-click (PPC) AdWords programme.
The search engine spider AdsBot, a close relative of Googlebot, is responsible for judging the quality of landing pages used in PPC campaigns and adjusting their Quality Score accordingly.
A good Quality Score means you are rewarded with lower minimum bids and potentially a higher position.
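The interplay between bid and Quality Score can be sketched with the simplified ranking model Google described at the time: ad rank is roughly the maximum cost-per-click bid multiplied by Quality Score. The function below is a hypothetical illustration of that idea, not Google’s actual formula, which weighs further signals.

```javascript
// Simplified, hypothetical model of AdWords positioning:
// Ad Rank ≈ max CPC bid × Quality Score.
function adRank(maxCpcBid, qualityScore) {
  return maxCpcBid * qualityScore;
}

// A lower bid can outrank a higher one on the strength of its landing page.
const bidderA = adRank(1.00, 8); // strong landing page, modest bid
const bidderB = adRank(1.50, 4); // weak landing page, bigger bid
console.log(bidderA > bidderB); // true: bidder A takes the higher position
```

This is why landing page quality is now a commercial lever and not just an SEO nicety: improving the page can be cheaper than outbidding a rival.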
Traditionally, it has been the search engine optimisation agency’s role to advise on what Google’s bots do and do not consider good pages.
These days, PPC specialist agencies and traditional media buyers have to do more than just calculate budget spread, agree keyword lists with clients and compose bidding strategies.
To get maximum value from a PPC campaign, agencies have to learn what makes a web page good in Google’s eyes.
This is one of the reasons why interest in SEO continues to grow in leaps and bounds.
There are other significant overlaps between SEO understanding and PPC performance. Last week Google opened Website Optimiser, its multivariate testing tool, to public use.
Like many other multivariate optimisation tools, Website Optimiser can either help or hinder your search campaign. It all depends on how you use the product.
Website Optimiser is an ‘optimisation’ tool in that it helps you optimise landing pages to turn PPC traffic into conversions and leads.
For example, do you make more conversions if you use the phrase “life insurance” or “life assurance” on your landing page? Is it better to have a picture of a happy family or of an individual?
Website Optimiser lets you test the two (or more) options and find out which one sells more products for you.
Sounds good, doesn’t it? Then why was there debate in the online search engine optimisation community that Google’s own product might produce spam pages?
In my opinion the debaters were whipping up a storm in a teacup. After all, Google is not stupid. If a site shows no other signs of search spam, has not been buying links through an agency and clearly has Website Optimiser code on its landing pages, then that is easily recognisable as legitimate testing.
The concern over the product is fairly simple though – Google strongly dislikes pages which present one set of content to the search engine’s bots (such as the phrase ‘life assurance’) but which appear to say something else to normal users (such as switching to the phrase ‘life insurance’). This is exactly what Website Optimiser’s code does.
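To see why the bot and the user can end up reading different copy, consider how client-side testing tools typically work. The sketch below is a hypothetical simplification: the HTML served to everyone (bots included) carries the original phrase, and JavaScript then rewrites it for a randomly chosen variant, so a crawler that does not execute scripts sees different text to the visitor.

```javascript
// Hypothetical sketch of client-side variant swapping.
// The served HTML contains 'life assurance'; script may rewrite it,
// so non-script-executing crawlers and users can see different copy.
const variants = ['life assurance', 'life insurance'];

function chooseVariant(list) {
  return list[Math.floor(Math.random() * list.length)];
}

// In a browser this would run on page load, for example:
// document.getElementById('headline').textContent =
//   'Compare ' + chooseVariant(variants) + ' quotes';
console.log(chooseVariant(variants));
```

The element id and phrasing here are illustrative assumptions; the point is simply that the swap happens after the page is served, which is what makes it resemble cloaking.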
Some search agencies are ensuring that paid search landing pages are composed so that Google’s AdsBot (the crawler concerned with Quality Score) can view them while Googlebot (the main spider) is told that these pages are not for inclusion in the main index. This is a good idea.
This scenario helps ensure that the paid search landing pages are not mistaken for spam pages for organic search.
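One way to arrange this is in robots.txt. A hedged sketch follows, assuming the paid search landing pages live under a hypothetical /ppc/ directory: Googlebot is blocked by name, and because AdsBot-Google (the AdWords landing-page crawler) only obeys rules that address it explicitly, it can still fetch the pages for Quality Score purposes.

```
# Hypothetical robots.txt. /ppc/ is an assumed directory of
# paid search landing pages.
# Googlebot is blocked by name, keeping the pages out of the
# organic index; AdsBot-Google is not named here, so it may
# still crawl them to assess landing page quality.
User-agent: Googlebot
Disallow: /ppc/
```

A meta robots noindex tag on the pages themselves is an alternative route to the same outcome.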
There are paid search campaigns running in conjunction with multivariate optimisation tools that are less well known to Google than its own Website Optimiser. Those pages might well be more likely to suffer a negative reaction from Google.
This sort of scripting does not fare well in organic search engine optimisation.
Careless use of A/B testing technology for paid search campaigns on pages which have previously done well in organic search could easily cost those pages their organic search positions.
There are other good reasons why an AdWords campaign would want to use dedicated landing pages and keep those pages separate from organic search. The basic SEO issue of duplicate text is one.
Search engines want to rank the original version of an authoritative page and not a host of copies.
If you have duplicate pages within your site then you have competition from within your own site for organic search positions. Setting up a number of paid search landing pages could result in just this type of internal competition.
The problem goes away once you ensure Google’s organic search spiders know to steer clear of those pages while Google’s AdWords spider knows it is welcome.
Of course, you could just block all forms of spiders (bots, crawlers, etc) from your paid search landing pages but then your PPC campaign will suffer from terrible Quality Score issues.
There are tracking issues too. It’s not uncommon to see websites whose analytics cannot tell the difference between organic traffic and paid traffic.
If you are crediting all traffic and conversions to any given landing page as PPC traffic then you have to be very sure that that particular page is not also benefiting from SEO traffic.
It may look as if a paid search campaign is doing amazingly well simply because that landing page enjoys some organic search positions and is collecting and converting organic traffic. This is true of banner and offline campaigns too.
If you have a TV advert for a particular URL and are making the assumption that all traffic to that URL is due to the TV advert, then you must make sure that search engines are not also sending traffic to that particular page.
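The attribution problem above is usually solved by tagging paid destination URLs with a tracking parameter so shared landing pages can still be split by source. The sketch below assumes a hypothetical `src` parameter appended to every PPC destination URL; the parameter name and URLs are illustrative, not any particular analytics package’s convention.

```javascript
// Hypothetical sketch: PPC destination URLs carry ?src=ppc, so a visit
// without the tag can be treated as organic (or direct) rather than paid.
function trafficSource(landingUrl) {
  const url = new URL(landingUrl);
  return url.searchParams.get('src') || 'organic';
}

console.log(trafficSource('https://example.com/ppc/quote?src=ppc')); // 'ppc'
console.log(trafficSource('https://example.com/quote'));             // 'organic'
```

With tagging in place, a landing page that also ranks organically no longer inflates the apparent performance of the paid campaign.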
It is certainly worth noting that there are also times when you would want to drive PPC traffic to a page which is also intended for SEO.
A home page bidding strategy is one example; using paid search to promote a viral asset or a resource, encouraging users to bookmark or link to the page, is another.
It is about control and performance. The more control you have over your online campaign, the better its performance will be. If you have limited control then you limit your performance.
As a result, a wider range of agencies and their clients are having to tackle organic search engine optimisation, spider control and joined-up advertising.
Andrew Girdwood is head of search at