The restriction of keyword referral data has had a major impact on SEO, with marketers seeing it as the second biggest obstacle to search success.
An inability to see this data has prevented marketers from optimising their organic search campaigns as they had done in the past.
Our UK Search Engine Marketing Benchmark Report 2014, produced in association with Latitude, has surveyed companies and agencies about their attitudes to the almost total loss of organic keyword data.
A potted history of (not provided)
Let's recap briefly. Back in October 2011, Google announced that it was encrypting searches for logged-in Google users (of which there are many), under the banner of making search more secure.
Understandably, this was met with consternation by search marketers and ecommerce professionals, as it made their jobs much more difficult.
The fact that such data was not restricted for those spending money on paid search naturally led to some cynicism.
At that point, (not provided) traffic made up a smaller proportion of total organic referrals, perhaps 10 to 20% of traffic, but that has changed.
Now, following the introduction of encrypted search for Firefox and Chrome users and, finally, Google's decision to redirect all traffic to the HTTPS version of the search engine in September 2013, there is precious little keyword data left.
For Econsultancy, 95% of all organic search data is encrypted, while notprovidedcount.com puts the overall figure at 85.83%.
In a nutshell, marketers now have so little keyword referral data that it is almost pointless to attempt to learn anything from it.
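To see how stark the figures quoted above are, here is a minimal sketch of how you might calculate the encrypted share from an organic keyword report. The referral counts are hypothetical, chosen to illustrate a 95% (not provided) share like Econsultancy's:

```python
from collections import Counter

# Hypothetical organic keyword referral counts exported from an
# analytics tool; "(not provided)" stands in for encrypted searches.
referrals = Counter({
    "(not provided)": 9500,
    "mobile analytics": 220,
    "seo benchmark report": 180,
    "ecommerce statistics": 100,
})

total = sum(referrals.values())
encrypted_share = referrals["(not provided)"] / total * 100
print(f"{encrypted_share:.1f}% of organic referrals are encrypted")  # 95.0%
```

With only the remaining 5% visible, any keyword-level conclusion rests on a small and probably unrepresentative sample.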
Consequences of (not provided)
Previously, keyword data could have been used to learn more about the traffic arriving at your site, and whether efforts at optimising certain keywords were effective or not.
It could also have been used to optimise landing pages and to provide a better user experience for visitors.
Now, half a year on from the almost complete removal of keyword data, marketers are seeing this as a major barrier to effective SEO, second only to lack of resource.
32% of companies and 29% of agencies see the rise of (not provided) as a major problem.
Which of the following are the biggest problems preventing you / your clients from being as successful at SEO as you would like?
Also, while 44% of companies said they are able to track ROI for paid search, only 31% are as confident in their ability to track ROI for SEO, possibly due to the (not provided) issue.
The data is gone for good: what next?
While bemoaning the loss of keyword data is understandable, marketers now need to adjust to the new situation.
A smarter approach to organic search is needed, as Kevin Gibbons points out:
I think it drives us more towards having an integrated digital strategy, where SEO is measured as a single channel of a wider marketing campaign. That will force people to move towards a more multichannel approach (if they haven't already), using clearer business metrics, which will make SEO far more measurable, rather than less.
As SEO has evolved a lot more towards a content-driven approach, one thing I have found increasingly useful is the ability to analyse organic performance per page, as opposed to keyword.
That way you can figure out what content is resonating with your audience best and being rewarded by Google as a result. I'd expect to see this shift continuing, as it's more actionable in the sense that you know what's working, so do more of it!
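The page-level analysis Kevin describes can be sketched in a few lines. This is a minimal illustration using made-up analytics rows (landing page, organic visits, conversions), not a specific tool's API:

```python
from collections import defaultdict

# Hypothetical rows of (landing_page, organic_visits, conversions)
# exported from analytics; page-level analysis sidesteps the missing
# keyword dimension entirely.
rows = [
    ("/blog/seo-guide", 1200, 36),
    ("/blog/seo-guide", 800, 20),
    ("/reports/benchmark", 500, 40),
    ("/blog/analytics-tips", 300, 3),
]

visits = defaultdict(int)
conversions = defaultdict(int)
for page, v, c in rows:
    visits[page] += v
    conversions[page] += c

# Rank pages by organic conversion rate to see which content resonates.
ranked = sorted(visits, key=lambda p: conversions[p] / visits[p], reverse=True)
for page in ranked:
    rate = conversions[page] / visits[page] * 100
    print(f"{page}: {visits[page]} visits, {rate:.1f}% conversion rate")
```

Ranking by conversion rate rather than raw traffic highlights which content is actually working, which is exactly the "do more of what works" loop described above.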
A Moz survey from earlier this year looked at how marketers were adapting to (not provided).
Here’s how 3,700 industry people answered the question “how do you cope with (not provided)?”
- 69% focus on conversion rate and performance metrics.
- 66% focus on landing page traffic.
- 58% rely more on Google Webmaster Tools data.
- 41% try to estimate traffic based on other data.
- 37% focus on social signals (tweets, likes, +1s).
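The "estimate traffic based on other data" approach in the list above can take many forms; one common assumption (a rough heuristic, not an official method) is that the (not provided) visits follow the same keyword distribution as the traffic that is still visible. A minimal sketch with hypothetical numbers:

```python
# Visible keyword referrals and the encrypted bucket, hypothetical figures.
visible = {
    "mobile analytics": 220,
    "seo benchmark report": 180,
    "ecommerce statistics": 100,
}
not_provided = 9500

# Distribute the encrypted visits proportionally across visible keywords.
visible_total = sum(visible.values())
estimated = {
    kw: v + not_provided * v / visible_total
    for kw, v in visible.items()
}
for kw, est in estimated.items():
    print(f"{kw}: ~{est:.0f} estimated visits")
```

The obvious caveat is that the visible sample may be skewed (for example, towards users who are not logged in), so these estimates are directional at best.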
One thing worth pointing out is that this is a level playing field. All brands in all sectors are facing the same challenge and it's a case of looking elsewhere to improve search performance.
There are also a few alternatives and workarounds that can help improve understanding, such as using site search data, and keeping a close eye on Webmaster Tools.
Or you could just demand keyword data from visitors with an intrusive pop-up. What could possibly go wrong? (thanks to @RavenJon).
What do you think? Is (not provided) one of the biggest barriers to SEO? How are you dealing with this?