Although the relationship between dynamic personalisation and search engines is anything but a happy marriage, it seems to be evolving in the right direction.

Do you remember those Christmas mornings as a kid, when you rushed downstairs, dizzy with excitement and nervous in anticipation of what Santa left waiting for you under the tree?

You tore through the snowman-covered wrapping paper, your fingernails peeling back the final layers to reveal (gasp!) the perfect gift.

So perfect, in fact, that it was exactly the thing that you asked for when you wrote that letter to Santa weeks ago.

Now, rewind to that same scenario on another one of those frosty Christmas mornings.

Everything is the same: the excited frenzy, the great expectations. But this time, something changes. This time when you tear away the gift wrap, you are met with a sinking feeling of disappointment and betrayal. The gift that lies before you isn’t what you wanted at all. As a matter of fact, it’s just a sweater...

When nearly three-quarters of consumers report similar feelings of frustration after encountering generic web content, it seems that marketers aren’t doing a very good job of delivering Santa-quality site experiences.

So what is the Christmas morning equivalent of consumers getting exactly what they want in the marketing world? Dynamic content personalisation. 

The right message for the right customer

By using dynamic content personalisation, brands can ensure that buyer persona groups receive messaging that is relevant to their interests and needs.

It has even been reported that some businesses are seeing a 19% uplift in sales as a result of content personalisation. If this is such a golden ticket for conversions, why are many webmasters slow, or at least cautious, on the uptake?

A few years ago, Google’s stance on dynamic URLs was quite clear: don’t do it. Back then, search engines could easily get caught in an infinite loop if they followed dynamically created pages.

This was especially a problem with PHP scripts that can generate the same content under different URL strings. When this happens, Google’s spiders find the URLs, but each one displays identical content, which can lead to big penalties for a website’s SEO.
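To make that trap concrete, here is a minimal, purely hypothetical sketch (the product.php filename, the id, sessionid and ref parameters, and the product data are all invented for illustration). The script renders exactly the same markup whether it is requested as /product.php?id=7 or /product.php?id=7&sessionid=abc123&ref=email, so a spider that follows both URL strings ends up indexing two copies of one page.

<?php
// product.php: a hypothetical script that ends up serving duplicate content.
// Both of these requests render identical HTML:
//   /product.php?id=7
//   /product.php?id=7&sessionid=abc123&ref=email
// A crawler that discovers both URL strings sees two "pages" with the same content.

$products = [
    7 => ['name' => 'Blue winter sweater', 'price' => '£25'],
];

$id      = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$product = isset($products[$id]) ? $products[$id] : null;

// The sessionid and ref parameters may be used elsewhere (tracking, personalisation)
// but have no effect whatsoever on the markup below.
?>
<html>
<head>
  <title><?php echo $product ? htmlspecialchars($product['name']) : 'Not found'; ?></title>
</head>
<body>
<?php if ($product): ?>
  <h1><?php echo htmlspecialchars($product['name']); ?></h1>
  <p>Price: <?php echo htmlspecialchars($product['price']); ?></p>
<?php else: ?>
  <p>Product not found.</p>
<?php endif; ?>
</body>
</html>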

Even as recently as 2013, Google had the following statement listed in its webmaster guidelines:

If you decide to use dynamic pages, be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few. Don’t use “&id=” as a parameter in your URLs, as we don’t include these pages in our index.

But after seeing the tangible benefits, brands moved forward with personalisation anyway, judging that the rewards outweighed the risks to SEO.

Rich web behavioural data gave marketers the ability to segment website visitors based on criteria like demographics, past behaviours, past purchases, and interests.

With personalisation, visitors remain on websites longer, bounce less, download more offers, and ultimately spend more money.

Dynamic web experiences move customers quickly to conversion, dramatically shortening the sales cycle.

Google’s evolving personalisation guidelines

It is a reality that dynamic pages are becoming a huge part of the internet’s future, and even Google has changed its tune on the matter, issuing the following official statement on its blog:

Google now indexes URLs that contain the “&id=” parameter. So if your site uses a dynamic structure that generates it, don’t worry about rewriting it -- we’ll accept it just fine as is.

Although some online resources advocate changing dynamic URLs to appear as though they are static, this often causes more problems than it solves.

While static URLs might have a slight advantage in terms of clickthrough rates, rewriting dynamic URLs to appear static is not only hard to maintain but also requires constant updates as soon as new parameters are defined.

So if long, complex dynamic URLs containing more than a couple of parameters are bad, and dynamic URLs that are altered to appear static are bad, then are we just stuck between a rock and a hard place?

Best practices for brands using dynamic content

The answer isn’t as black and white as some of us would like it to be, but Google confirms that “the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking.”

Because the term “significant disadvantage” is more than a bit vague and subjective, there are a number of best practices that your website should adhere to in order to ensure optimisation (a rough sketch illustrating a few of them follows the list):

  • Including hard-coded text links behind dynamic content will allow bots to successfully crawl and index your site
  • Submit an accurate sitemap to Google Webmaster Tools, linking to all content that may be displayed
  • By including a basic level of static content on each page and then covering it with dynamic content, you will allow Google bots to 'see' your entire website
  • Dynamic URLs don’t have to be limited in the number of parameters, but a good rule of thumb is to keep both static and dynamic URLs short
  • Make AJAX crawlable
  • Create static URLs that link to the same content as each dynamic URL.
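As a rough illustration of how a few of these points fit together, the hypothetical template below (reusing the invented product.php example, with made-up paths such as /category.php and /js/personalisation.js) emits a rel=canonical tag so that every parameterised variant of a URL points search engines at one preferred version, keeps a basic layer of static content and hard-coded text links in the initial HTML, and leaves the personalised extras to be layered on top client-side.

<?php
// Hypothetical page template (product.php again) combining several of the practices above.
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// One preferred URL for this content, regardless of any extra tracking or
// personalisation parameters that happen to be on the request.
$canonicalUrl = 'https://www.example.com/product.php?id=' . $id;
?>
<html>
<head>
  <title>Product <?php echo $id; ?></title>
  <!-- An instruction to search engines: treat every URL variant as this one page -->
  <link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl); ?>" />
</head>
<body>
  <!-- Basic static content and hard-coded text links, always present in the HTML,
       so bots can crawl the site structure even when scripts never run -->
  <nav>
    <a href="/index.php">Home</a>
    <a href="/category.php?cat=knitwear">Knitwear</a>
  </nav>

  <h1>Product <?php echo $id; ?></h1>
  <p>The core product description lives here as static, crawlable text.</p>

  <!-- Personalised recommendations are layered on top for the visitor;
       the static content above remains in place for both users and bots -->
  <div id="personalised-recommendations"></div>
  <script src="/js/personalisation.js" async></script>
</body>
</html>

The exact markup will of course vary by platform; the point is simply that the crawlable essentials (the canonical URL, the navigation links and the core copy) should not depend on the personalisation layer.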

The relationship between dynamic personalisation and search engines is anything but a happy marriage, but it does seem to be evolving in the right direction.

Personalisation makes sense not only for marketers; it should also make sense for Google when it comes to gauging quality and page relevance.

Moving forward, as dynamic content becomes more common, expect to see search engines adapt further. Until then, vigilance about best practices is a webmaster’s most valuable tool.

You can learn even more about personalisation at our two-day Festival of Marketing event in November. Book your ticket today and see how you can create truly tailored approaches to your customer journeys.

Published 18 August, 2015 by Tom Dibble

Tom Dibble is the founder & CEO of Screen Pilot and a contributor to Econsultancy. 

Comments (5)

Hayden Sutherland, Director at Ideal Interface

Tom.
Good post, thanks for sharing.
One question I do have related to your advice about "Create static URLs that link to the same content as each dynamic URL."
Would this not create further duplicate URLs (which should really be avoided)?
Hayden

over 2 years ago

Ben Potter, Director at Ben Potter - business development mentor

Hi Tom,

Good post.

I would question one of the recommendations above (assuming I have understood it correctly), where you say 'By including a basic level of static content on each page and then covering it with dynamic content, you will allow Google bots to 'see' your entire website'

Are you suggesting that the static content is hidden from the user? If so, this would be a questionable approach - reminds me of the days when people used to put white, over-optimised text on a white background!

I am sure this isn't what you mean but would be useful to clarify.

Thanks.

Ben

over 2 years ago

Tom Dibble, CEO at Screen Pilot

Hayden. Thanks for your question. Specifically, I am talking about implementing a rel=canonical which should do the trick in this situation. I should have chosen different language in that bullet point because we aren’t necessarily talking about a link per se, but an instruction to Google.

over 2 years ago

Tom Dibble, CEO at Screen Pilot

Good question Ben. Let me clarify. By loading static content first, I mean including a basic level of static-loading content on each page such as navigation links or comments, not the actual bulk of your page. This way, as you layer your dynamic and even personalised content over your static content, Google will still be able to clearly navigate the structure of your site if so desired. I never advocate trying to pull the wool over the “eyes” of search bots because, as the industry has seen in the past, this only leads to trouble and penalties further down the road.

over 2 years ago

Hayden Sutherland, Director at Ideal Interface

@Tom - I agree that you should implement a "rel=canonical" tag to ensure a specific page gets all the 'juice'. The only further issue with duplicate URLs to the same content is that some SEO tools will not recognise the canonical and will still flag this up as a duplicate.
Thanks
Hayden

over 2 years ago
