There’s more to life than links you know

Google is evolving its algorithm to reduce its reliance on links, both to make it harder to game and to deliver better results.

What else does it (or might it) take notice of? Content. Social. Personal preferences. Behavioural data. Devices. Location. On-page factors. Search intent. All of these things are likely to play a much bigger role in the future. 

User experience signals will also help Google to rank sites. How do we know this? Because they are already being used. 

A few years ago Matt Cutts started to talk about the importance of site speed. Slow sites are not good for rankings.  

More recently Yoshikiyo Kato, a software engineer on Google’s Mobile Search team, explained that there would be “changes in rankings of smartphone search results”. Sites with crappy user experiences are not likely to rank well, it seems. 

Google is not just interested in making the mobile web a better place. It is looking at user experience factors more broadly, and regularly publishes guidance in this area. 

Keep in mind that a lot of the individual elements that make up the user experience can be identified automatically (or manually, for that matter), as we shall see. 

Let’s start with site speed…

Site speed

Google has a range of testing tools to evaluate the user experience on a website. One such tool is PageSpeed Insights, which you can use to identify the things you can do to boost response times. Give it a whirl.

The tool suggests various areas that are ripe for improvement, all aimed at helping you to make your web pages more lightweight. It isn’t much of a leap to imagine that Google’s search algorithm also takes notice of these things. 
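To see what that looks like in practice, here is a minimal sketch of pulling the improvement opportunities out of a PageSpeed Insights-style JSON response. The response shape below is a simplified assumption (the real API returns far more fields), so treat it as illustrative only:

```python
import json

# A trimmed, hypothetical PageSpeed Insights-style response. The real API
# returns far more fields; this shape is an assumption for illustration.
sample_response = json.loads("""
{
  "lighthouseResult": {
    "audits": {
      "render-blocking-resources":
        {"title": "Eliminate render-blocking resources", "score": 0.4},
      "uses-optimized-images":
        {"title": "Efficiently encode images", "score": 1.0},
      "server-response-time":
        {"title": "Reduce initial server response time", "score": 0.7}
    }
  }
}
""")

def improvement_opportunities(response, threshold=0.9):
    """List the audits scoring below the threshold (0 = worst, 1 = best)."""
    audits = response["lighthouseResult"]["audits"].values()
    return sorted(a["title"] for a in audits if a["score"] < threshold)

print(improvement_opportunities(sample_response))
# ['Eliminate render-blocking resources', 'Reduce initial server response time']
```

Each low-scoring audit maps directly to a concrete fix: fewer render-blocking resources, better-compressed images, a faster server response.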

It stands to reason that a slow site can lead to a high bounce rate: nobody likes to wait around for a heavy page or a tardy server to creak into action. Slow sites (and high bounce rates) are bad for rankings. In last year’s list of SEO ranking factors on Moz, ‘Response Time of Page in Seconds’ sat at the bottom of the list with a negative score (a slow response is negatively correlated with top rankings).

This made me think, as we have recently improved the speed of our own site. Let’s see how that correlates with our search performance.

Econsultancy’s site speed & SEO

In December we unified our URLs, dispensing with a previous approach that had more than 240 countries represented. Basically, if you visited from the US you’d see /us/ in the URL; if you visited from Australia you’d see /au/; and so on. With the benefit of hindsight, this was a bad idea, for various reasons.

We hired Rishi Lakhani to undertake a technical SEO audit, and his recommendation – in line with our own thinking – was to reintroduce a single URL per page. SEO and server load were two primary drivers. It obviously makes lots of sense to have all inbound links pointing to a single page, and having a faster site is a very pleasant side effect. 

When we made the change, reverting to one URL, we only needed to cache one version of each page, rather than more than 240. The knock-on effect is that pages are delivered in about half the time, as this chart illustrates…
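The caching arithmetic behind this is easy to sketch. The toy model below (all names and numbers are hypothetical) shows why 240-odd country variants of the same page multiply cache misses, while a single canonical URL is rendered once and served from cache thereafter:

```python
# Toy cache model (all names hypothetical) showing why country-prefixed
# URLs multiply cache misses: each variant must be rendered and cached
# separately, while a single canonical URL is rendered only once.

def serve(requests, cache):
    """Serve each request, counting how many miss the cache (i.e. need a
    slow page render rather than a fast cached copy)."""
    misses = 0
    for url in requests:
        if url not in cache:
            misses += 1          # expensive: render the page
            cache[url] = "html"  # then cache it
        # otherwise: cheap, served straight from cache
    return misses

countries = ["us", "au", "uk", "de", "fr"]  # imagine 240+ of these
page = "/blog/some-article"

# Old scheme: one URL per country -> one cache entry per country.
prefixed = [f"/{c}{page}" for c in countries for _ in range(3)]
# New scheme: every visitor hits the same canonical URL.
canonical = [page for _ in range(len(prefixed))]

print(serve(prefixed, {}))   # 5 misses, one per country variant
print(serve(canonical, {}))  # 1 miss, then every request is a cache hit
```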


Now let’s look at our search referrals. 

In the last three weeks of January we attracted 92k, 96k and 99k visitors from Google. The most we ever did in our best week in 2013 was 86k. A 7-15% rise is not to be sniffed at…

Is it just a coincidence that our search referrals have improved? Or does site speed have something to do with it? 

It can be difficult to say with accuracy what causes a bump in search rankings, and site speed may have played only a small part in this recent uplift. But if it helps a little, it still helps. And I wonder what the cumulative effect might be if we identified and acted on lots of other minor (positive) ranking factors.

So what else might help a little? What other user experience ranking signals might Google take note of, either now or in the future? Let’s explore the possibilities…

The overall mobile user experience

As mentioned above, lousy mobile experiences will lead to lousy mobile search positions. This is one reason why we’re working hard on a responsive site, which I hope will be ready to roll next month.

Google has tons of recommendations for developers and is clearly taking mobile very seriously indeed.

One look at what the PageSpeed Insights tool reports shows how easy it is for Google to figure out what’s right or wrong with your mobile site.


Button sizes

Google knows about the specifics too. It knows how big your buttons are. This is important for touchscreen devices, as buttons need to be big enough to be tapped; fingertips are, after all, bigger than mouse pointers. Increase the size of your ‘tap targets’ to avoid poor usability.
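This is exactly the kind of check that can be automated. A hedged sketch, assuming a minimum tap-target size of 48 CSS pixels (the figure Google’s own mobile guidance suggests) and element geometry scraped from a rendered page:

```python
# A sketch of the kind of check Google can automate: flag tap targets
# smaller than a minimum size. 48 CSS pixels is the figure from Google's
# touch guidance; treat the exact number as an assumption.

MIN_TAP_PX = 48

def undersized_tap_targets(elements, minimum=MIN_TAP_PX):
    """elements: (name, width_px, height_px) tuples, e.g. taken from a
    rendered page's geometry. Returns the names failing the size check."""
    return [name for name, w, h in elements if w < minimum or h < minimum]

buttons = [
    ("buy-now", 120, 48),
    ("tiny-close-icon", 16, 16),   # classic offender on mobile
    ("nav-link", 90, 32),
]
print(undersized_tap_targets(buttons))  # ['tiny-close-icon', 'nav-link']
```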


Third party delays

Google explicitly suggests that you should load your main content first, so if you have a bunch of third party widgets to load then you should definitely think twice before prioritising them. 

Think about all of those times when you have visited a news site and ended up twiddling your thumbs while some slow-ass ad server kicked into life to gradually show you some of those lovely banner ads. That always, always sucks.

The rule is: content first, ads and other third party widgets second. Keep an eye on things like social sharing buttons (which we’ve found to be outrageously slow), as well as anything in the sidebar. 
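A toy model (all timings invented) makes the ‘content first’ point concrete: perceived time-to-content is simply the sum of everything you load before the content itself:

```python
# Toy model (all timings invented) of "content first": the reader's
# time-to-content is the sum of everything loaded before the content.

def time_to_content(load_order, durations):
    """Return elapsed seconds until the article appears."""
    elapsed = 0.0
    for resource in load_order:
        elapsed += durations[resource]
        if resource == "article":
            return elapsed
    raise ValueError("content never loaded")

durations = {"article": 0.4, "ad-server": 2.5, "share-buttons": 1.2}

ads_first = ["ad-server", "share-buttons", "article"]
content_first = ["article", "ad-server", "share-buttons"]

print(time_to_content(ads_first, durations))      # roughly 4.1 seconds
print(time_to_content(content_first, durations))  # 0.4 seconds
```

Same total payload in both cases; only the ordering changes, and the reader feels the difference.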


Readability

There are lots of automated tools on the web to test the readability of your website, covering areas such as comprehension (Flesch-Kincaid scoring, etc), grammatical accuracy (beware typos) and legibility (font sizes, font weights, contrast, responsive typography). Google can test these things too.
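As an illustration, readability scoring is trivially automatable. Here is a minimal sketch of the Flesch Reading Ease formula; the syllable counter is a crude vowel-group heuristic, not what real tools use:

```python
import re

def count_syllables(word):
    """Crude heuristic: count groups of consecutive vowels, with a small
    adjustment for a silent trailing 'e'. Real tools use dictionaries."""
    word = word.lower()
    groups = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and groups > 1:
        groups -= 1
    return max(groups, 1)

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher = easier (60-70 is plain English)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

simple = "The cat sat on the mat. It was warm."
dense = "Organisational prioritisation necessitates multidisciplinary deliberation."
print(flesch_reading_ease(simple) > flesch_reading_ease(dense))  # True
```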



Accessibility

There are all manner of automated tests for you to use to see how accessible your website is (or isn’t). Figuring out this stuff will also be a cinch for Google.

Broken links and 404s

Bad for users, bad for SEO, and yet very easy to spot (Webmaster Tools will help you).
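To see how easy, here is a sketch of an internal link check: extract every href on a page and flag any that isn’t in your set of known live URLs (in practice you’d crawl the site or lean on the crawl errors report in Webmaster Tools):

```python
from html.parser import HTMLParser

# A sketch of how easy broken internal links are to spot: extract every
# href on a page and flag any that isn't in your set of live URLs.

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def broken_links(html, live_urls):
    """Return hrefs on the page that don't point at a known live URL."""
    parser = LinkExtractor()
    parser.feed(html)
    return [url for url in parser.links if url not in live_urls]

page = ('<p><a href="/blog/good-post">fine</a> '
        '<a href="/blog/deleted-post">404</a></p>')
live = {"/blog/good-post", "/blog/other-post"}
print(broken_links(page, live))  # ['/blog/deleted-post']
```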

This is already a negative ranking factor, and won’t become a positive one anytime soon.

Infinite scroll + footer = frustration

Chasing a footer down the page forevermore is very annoying. If you have important information or links in your footer then it is unforgivable.

Expect Google to narrow its eyes, if it spots this happening on your website.

Load pages / icons

Unfortunately this is a trend on the rise, as UX-dodging designers go mad with HTML5. Heavy pages are a heavy bummer, man, and besides, the eight-second rule is surely more like one or two seconds these days?

This kind of pre-content ‘please wait’ messaging and piss-taking extends to splash screens, interstitials, pagination, pop-ups, ‘thoughts of the day’, and other related rubbish. Burn all of this with thermite. 

At some point in the future Google will definitely* issue the harshest of penalties on websites that continue to inflict these awful things on their unwitting visitors.


Good-looking content = good user experience

Sub-headers. Wiggly margins. Bullet points. Formatting. Links. All of these things matter: they help break your page up into digestible chunks and make the reading experience more user-friendly.

I’d be surprised if Google doesn’t already use in-page formatting as some kind of minor ranking signal, particularly for long-form content. 

Social signals

You might think that social signals should not form part of this exploration of possible UX ranking factors, but I think social proof is a very important indicator for Google when it tries to determine how user-friendly a website is.

People tend to avoid sharing and recommending content that is annoying to consume and digest (I make a point of not doing so, and I’m not alone).

Consider the Moz ranking factors again. Social signals correlate highly with prominent search rankings. I’m quite sure that some of my own articles rank well due in part to spikes in social activity. 

As a sidenote, I’m also convinced that not all social shares are equal, and that things like the quality and uniqueness of a (re)tweet are very important. Google’s technology will be able to make sense of this more easily than a human can. As with links, volume matters, but so does quality.


Navigation

Google has just issued some rather timely guidance about faceted navigation and SEO, outlining approaches that are “ideal for searchers and Google Search”. Proof positive that navigation is a ranking factor.

Consider how your navigation works, how it is labelled, and how it is presented. 

A bigger step would be to undertake an audit of your information architecture. There’s a great article on Distilled about how to map out your IA.

In summary

User experience should always be at the top of the agenda, but the reality is that business goals and rules sometimes get in the way. However, if organic search matters to your business as much as it does to ours, and if you believe – as I do – that UX ranking signals will become stronger, then you cannot afford to neglect optimising and iterating your website.

So, in short…

  • The web is getting faster. Your site needs to match (or improve on) this trend. 
  • Smartphone usage is at an all-time high. It will continue to grow in the months and years ahead. Google is planning for a mobile future, and so should you.
  • Lots of minor ranking signals add up. Count the pennies and the pounds will look after themselves.
  • Google isn’t the only search engine that is focused on the user experience. Here are a bunch of excellent tips from Yahoo on how to improve site performance.

There are no doubt plenty of things that I haven’t explored, but this article is already well into TL;DR territory. Congratulations if you’ve made it this far, and please leave your own thoughts on this subject in the comments area below.