2013 saw many changes that affect the role of the SEO, most of which were instigated by Google. Some were good, some not so good.
The final removal of keyword referral data was the most obvious inconvenience for SEOs, but Google has also been busy tweaking its search results page, with more prominence for paid ads.
I’ve asked a number of SEOs for their views on the least welcome changes from 2013, as well as their hopes for the next 12 months. Please let me know yours in the comments.
From an SEO’s perspective, which of Google’s changes in 2013 have you least appreciated?
Julia Logan, Irish Wonder:
Where Google is going with the knowledge base. It is simply becoming a scraper, and on top of that it cynically advises the website owners to produce more great content.
Yeah right, so Google has something to scrape.
Dr Pete Meyers, marketing scientist at Moz:
Again, [not provided]. Easily.
Andrew Girdwood, media innovations director at LBi:
I dislike the 100% not provided and the communication around it. Google’s claims of ‘for privacy’ are too easily dismissed until you begin to speculate what other information Google might want to include in search results.
For a company that once claimed to want to organise the world’s information, a more fitting move would be to find a way to pass keyword data through within its improved privacy measures.
It could use a redirect link with a keyword-loaded query string, for example. Google has already established, with its PPC policy, that passing keyword data is not a privacy breach.
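To make the suggestion concrete: the idea is that a site could recover the search keyword from the query string of a referring redirect URL. This is a minimal sketch of that mechanism; the URL shape and the `q` parameter are invented for illustration, not an actual Google format.

```python
# Hypothetical sketch: recovering a keyword from a referrer URL's
# query string, as the redirect-link suggestion above describes.
# The "q" parameter and URL shape are illustrative assumptions.
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Return the 'q' query-string parameter from a referrer URL, if any."""
    params = parse_qs(urlparse(referrer).query)
    return params.get("q", [None])[0]

print(keyword_from_referrer("https://www.google.com/url?q=blue+widgets"))
# A referrer with no query string yields None.
print(keyword_from_referrer("https://www.google.com/"))
```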
Will Critchlow, founder and CMO at Distilled:
For immediate impact, (not provided) is my least favourite change. Keywords aren’t everything by any means, but they are useful and the public explanations given are just so disingenuous.
If Google really meant the privacy explanation it gave, it not only wouldn’t be passing paid keyword data either, but could also have found a sensible middle ground on what to share instead of removing it all.
Directionally, I’m also very much not crazy about the UX changes to image search which seek actively to prevent searchers from visiting the sites that contain the images.
I see this as breaking the implied agreement whereby sites allow Google to crawl their sites in exchange for getting traffic when that crawl discovers a good result.
Richard Baxter, CEO at SEOGadget:
Well, I think Panda’s increasing aggressiveness has largely been ignored since Penguin came along. I’ll choose that update, but I actually appreciate it.
It’s an interesting update to work with – when you get really deep into technical work, particularly into the log files of affected sites, you can really see why there’s a problem. The trick is to compare before and after.
What you find with log files is that they tend to confirm what most SEOs already say because they think it’s best practice. Very ‘thin’ pages with little unique content tend to encourage weird crawl behaviour.
For example, Googlebot downloads much less of a thin page (content downloaded versus total page size) than it does of, say, a content-rich, really well-developed page. So, as much as I don’t appreciate my job being harder, I certainly appreciate it being more interesting!
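The before-and-after comparison Baxter describes can be approximated from ordinary access logs. This is a minimal sketch, assuming logs in the common Apache/Nginx combined format; the sample lines and file layout are invented for illustration.

```python
# Hypothetical sketch: totalling bytes served to Googlebot per URL from
# combined-format access logs, to compare thin vs content-rich pages
# (or before vs after an update). Sample data below is invented.
import re
from collections import defaultdict

LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_bytes(log_lines):
    """Total bytes served to Googlebot, keyed by URL path."""
    totals = defaultdict(int)
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and "Googlebot" in m.group("agent") and m.group("bytes") != "-":
            totals[m.group("path")] += int(m.group("bytes"))
    return totals

sample = [
    '66.249.66.1 - - [10/Jan/2014:00:01:02 +0000] "GET /thin-page HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Jan/2014:00:02:02 +0000] "GET /rich-page HTTP/1.1" '
    '200 48210 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    # Non-Googlebot traffic is ignored:
    '10.0.0.5 - - [10/Jan/2014:00:03:02 +0000] "GET /thin-page HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0"',
]
print(dict(googlebot_bytes(sample)))
```

Running the same tally over logs from before and after an update, then diffing per URL, surfaces the crawl-behaviour changes described above.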
What do you expect/hope to see in 2014?
I dream of a strong competitor rising so we all have some choice, as searchers, site owners and SEOs. This probably won’t happen very soon, maybe not in 2014, but can I dream?
Dr Pete Meyers:
I suspect a strong shift to a more card-based search result, akin to Google Now, Google+, and mobile search.
Google wants to mix and match information seamlessly, regardless of how you consume it. I believe we’ll see a Knowledge Graph expansion based on Google’s index – in other words, it’s going to extract ‘knowledge’ directly from websites more and more (and not just a small set of big databases).
I hope it’ll open some data back up and become more transparent, but I don’t expect it.
Kevin Gibbons, UK MD at Blueglass:
Bigger and better marketing campaigns. Less focus on tactics, and more integrated strategy across multiple channels.
We’ve certainly seen a shift ourselves towards a more consumer-led and customer centric strategy, looking to improve the overall user experience across multiple channels and devices.
Focusing much more on the bigger picture and being rewarded by Google as a result – as opposed to more tactical bursts of campaigns.
Hope and expect are very different. I hope to see improved keyword data in Webmaster Tools, along with easier data extraction from it. I doubt we’ll get that.
I expect to see more chat around Google+ and for Google to fuel that. I predict more SEO teams will spend more time talking about ‘signals’ rather than just ‘links’.
I fear we’ll see trouble when it comes to the difference between editorial and advertorial. The difference does not seem to be well understood by many bloggers and digital publishers.
Whether it’s in-house teams or SEO agencies doing the outreach doesn’t seem to matter, but too many brands seem too happy either to play to those misunderstandings or to actively encourage them.
I expect to see some innovative YouTube ad formats that could set it on the way to becoming a real brand-building platform for the web and see it claim a significant chunk of brand advertising spend.
Coupled with this, I think we will see a subtle shift away from UGC and towards professional content on YouTube – it could become the web’s equivalent of free-to-air TV, with Netflix as its cable counterpart.
I think Dr. Pete is spot on in his predictions of what we will see on the UI front. I expect to see some live experimenting with more social ranking factors, particularly in the fresh results.
Teddie Cowell, director of SEO, Mediacom:
I expect to see a lot happening around interaction, with search results appearing in more places where we haven’t previously seen them – think of the Android 4.4 KitKat contacts list as a current example of this.
I hope to see some control mechanisms put in place for the Knowledge Graph. There have been a few too many factual inaccuracies, some highly embarrassing, so currently it feels like Google is playing with fire in regards to what the Knowledge Graph says.
With particular regard to brands, which by their nature as recognised entities are more likely to trigger the Knowledge Graph, it’s very dangerous territory, because coincidentally they are also some of Google’s most valued advertisers.
Spammy SEO to be gone. Google keeps saying it’s getting better at tackling the bad stuff, but there are still plenty of examples around – it’s a case of ‘do what you said you’d do’, and of not giving very poor quality SEO agencies any more fuel by way of case studies of their bad, temporary tactics.
It would make it much easier to get the message across that the good guys do a good job and that SEO is a credible, technical and content marketing discipline that is very much here to stay.
Jimmy McCann, head of SEO at Search Laboratory:
Improved accuracy in the Webmaster Tools link examples that are given under manual action. These have been automated and incorrect in the past – which is a pain.
It would be better if the examples were more explicit and told you exactly what was required to sort out the penalty.
Adam Skalak, head of SEO at iCrossing:
The most significant shift for SEO is that we are no longer limited by keyword phrases. The expected 2014 expansion of Google’s semantic-search offering, the Knowledge Graph, offers both opportunities and challenges for brands.
Established brands will benefit from greater exposure in more prominent parts of the results pages. But with Google providing answers directly at the top of the page, brands may struggle to increase their organic traffic as this could remove the need for people to click through to their site at all if they don’t need too much detail.