
Google scored a new patent this week. Does it indicate a change in direction for the search giant's logic, or even pose yet another problem for Google's competitors?

A key change in SEO over the years has been a move from optimising for just a few quality signals (links, a handful of relevancy-based on-page tags, etc.) to optimising for very many more signals. Modern SEO is multi-signal.

One of the biggest apparent differences between Bing and Google had been around the use of behavioural signals, those signals generated by human interaction on the web.

For example, if a search engine sees that someone searches for [digital marketing news], clicks on Site A but then abandons Site A within seconds, returns to the search results, clicks on Site B and then stays on Site B, then that would suggest that Site A failed to satisfy.
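As a rough illustration of how such a "pogo-sticking" signal might be derived from a click log, here is a minimal sketch. The log format, field names and dwell-time threshold are assumptions made for this example, not anything Google or Bing has published.

```python
# Hypothetical sketch only: deriving a "failed to satisfy" signal from a
# simplified click log. Field names and the dwell threshold are assumptions.
from dataclasses import dataclass

@dataclass
class Click:
    query: str
    result: str            # e.g. "Site A"
    dwell_seconds: float   # time spent before returning to the results page

def unsatisfying_clicks(clicks, min_dwell=30.0):
    """Results the user abandoned quickly -- a possible pogo-sticking signal."""
    return [c.result for c in clicks if c.dwell_seconds < min_dwell]

session = [
    Click("digital marketing news", "Site A", dwell_seconds=4.0),
    Click("digital marketing news", "Site B", dwell_seconds=300.0),
]
print(unsatisfying_clicks(session))  # ['Site A']
```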

Bing has always been happy to say that behavioural signals are very important to them. I heard this first from Barney Pell, who created Powerset and who transformed MSN Search into Bing, at a search conference in Spain many years ago.

He was happy to tell the audience that a good way to get Bing to notice new content was to throw users at it. Afterwards, I pointed out that Google typically denied looking at similar metrics.

I recall his reaction was one of puzzlement: once Powerset had access to that sort of information, it quickly became incredibly important to them. Dr Pell simply could not understand why Google would choose to ignore that data, or how they could.

More recently, Duane Forrester, a Sr. Product Manager at Bing, suggested that behavioural signals were most important for Microsoft’s search engine, ahead of social signals.

Google does seem to have told the opposite story. In the scenario above, with the bounce from Site A to Site B, we are essentially talking about using bounce rates to test the worthiness of a site.

In 2008, Matt Cutts, Google’s Head of Web Spam, seemed to suggest his team certainly did not look at bounce rates (though, of course, other areas of Search Quality might have done). In a Sphinn thread he said:

"… I’ll just say that bounce rates would be not only spammable but noisy. A search industry person recently sent me some questions about how bounce rate is done at Google and I was like "Dude, I have no idea about any things like bounce rate. Why don’t you talk to this nice Google Analytics evangelist who knows about things like bounce rate?" I just don’t even run into people talking about this in my day-to-day life."

(I understand Sphinn is due to be retired. The Sphinn link may not work if you read this post weeks after its original publication).

However, the same story taken from a different angle reads very differently. Google has been quite happy to tell the world that it does use panels of human testers to judge the quality of its search results. It uses these panels to measure whether proposed algorithm tweaks make the search results better or worse.

This week Google secured a patent to automate the process of evaluating search results. This patent, which was filed in 2004, describes how a behavioural benchmark is taken over a period of time and then compared to a second set of behavioural measurements over a later period of time.

Google’s intent does seem to be to replace or assist its human-led results evaluation, rather than site evaluation. The patent opines:

"Manual evaluation of search quality can be laborious and time consuming, typically allowing for a small number of searches and search results to be evaluated to determine overall quality of a search engine."

The patent discusses metrics such as the percentage of searches in which the user selected the first result, or the percentage of users who did not reformulate their search. These sorts of metrics are clearly there to measure the quality of the SERPs rather than an individual page.
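As a minimal sketch of how those SERP-level metrics could be computed and then compared across the two periods the patent describes, consider the following. The log structure and example figures are invented purely for illustration.

```python
# Hypothetical sketch: SERP-level quality metrics of the kind the patent
# mentions, computed for a baseline period and a later period.
# The log format and the numbers are invented for illustration.
def serp_metrics(search_log):
    """search_log: list of dicts with 'clicked_position' (int or None)
    and 'reformulated' (bool)."""
    n = len(search_log)
    return {
        "first_result_click_rate": sum(1 for s in search_log
                                       if s["clicked_position"] == 1) / n,
        "no_reformulation_rate": sum(1 for s in search_log
                                     if not s["reformulated"]) / n,
    }

baseline_period = [
    {"clicked_position": 1, "reformulated": False},
    {"clicked_position": 1, "reformulated": False},
    {"clicked_position": 3, "reformulated": True},
]
later_period = [
    {"clicked_position": 2, "reformulated": True},
    {"clicked_position": 1, "reformulated": False},
    {"clicked_position": None, "reformulated": True},
]
# The benchmark from the first period can then be compared with the later period.
print(serp_metrics(baseline_period))
print(serp_metrics(later_period))
```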

The technique, fresh in its patent suit, does clearly talk about user behaviour, though. The summary of the invention discusses "another implementation consistent with the principles of the invention" and explicitly uses the phrase:

"... to record user behavior for the time period; and logic to determine a quality of the plurality of items based on the recorded user behavior and the predicted user behavior."

At other points in the patent application, Google’s intent to react when behavioural measurements fall below par is clear. Figure 3 in the patent is a simple flow chart that ends with:

Quality not within expected range; take remedial measures
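Reading that final step as code, the comparison amounts to checking whether measured quality sits within an expected range around the prediction. The tolerance value below is an assumption made purely for illustration.

```python
# Hypothetical sketch of the flow chart's final decision: flag remedial
# measures if measured quality falls outside the expected range.
def needs_remedial_measures(measured, predicted, tolerance=0.05):
    """True when measured quality is not within the expected range."""
    return abs(measured - predicted) > tolerance

# e.g. a first-result click rate predicted at 0.60 but measured at 0.45
print(needs_remedial_measures(measured=0.45, predicted=0.60))  # True
```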

Google’s patent, therefore, goes beyond simply measuring user behaviour to those "remedial measures", which would be a re-ranking or a re-personalisation.

Perhaps the most interesting twist of the patent is that it discusses modelling the results based on predicted user behaviour. In other words, if Google’s system detects that searchers expect to see Brand Y at the top of the results after searching for [keyphrase X], then the results will change in order to give the searcher Brand Y.
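As an oversimplified sketch of that idea (the results list, the "predicted preference" and the re-ranking rule are all invented here; nothing in the patent spells the mechanics out this way), promoting an expected brand could look like this:

```python
# Invented illustration: promote the result users are predicted to expect.
def rerank(results, predicted_preference):
    """Move the predicted-preference result to the top, keeping the rest in order."""
    return sorted(results, key=lambda r: r != predicted_preference)

serp = ["Site C", "Brand Y", "Site D"]
print(rerank(serp, predicted_preference="Brand Y"))  # ['Brand Y', 'Site C', 'Site D']
```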

The implications for search marketers are clear. If you can drive enough search volume around keywords or destinations to show Google’s learning systems that users expect to see certain recognisable brands high in the search results, then you can use multi-signal search to favour your clients.

The good news for Bing and Google’s other competitors is that there are other methods for tracking user behaviour during the search process. Bing will, most likely, get to keep on using this set of quality signals as its most important ranking factor.

Pro tip: Combing through search patents is important. Bill Slawski's SEO by the Sea is many people's number one choice for alerts on relevant patents. Recommended.

Picture credit: Rachael Lovinger


Published 24 November, 2011 by Andrew Girdwood

Andrew Girdwood is Head of Media Technologies at Signal and a guest blogger for Econsultancy. He can be found on Twitter here.
