But before I start, I want to caveat this blog post clearly: 7thingsmedia does not condone the use of negative SEO. This is purely a topical point to discuss within the industry – whether taboo or not!

Until recently, negative SEO stayed out of the spotlight, lurking in the darker, more subversive corners of the net, like BlackHatWorld.com. But with significant visibility drops for uSwitch, Halifax, William Hill and Dialaphone, and after Expedia’s recent penalty and subsequent share price decline, the topic has become the subject of heated discussion in the SEO world.

Funnily enough, in Expedia’s case there was no real suggestion that any negative SEO was at work. It was more a case of Expedia using aggressive, outdated techniques to scale its link profile in a fairly automated manner (by injecting anchor-text-rich backlinks to Expedia into travel blog templates, among other techniques).

Regardless, this has still posed some pertinent questions in the industry about the potential growth of this quintessentially blackhat tactic.

So what is negative SEO in 2014?

Common negative SEO practices currently all revolve around one of the main factors determining a site or page’s ability to rank: external links. This is because external links are the one area of an SEO campaign that cannot be explicitly controlled by the brand or agency; manipulation by third parties is therefore always a risk.

So imagine Competitor A notices that Competitor B has moved above it in the SERPs for a crucial high-volume term. Competitor A has some projects in the pipeline targeting the keyword, but is dealing with a manual link penalty. Competitor A decides that if Competitor B also had an albatross around its neck, it would stand a better chance in the war of attrition.

Ten minutes on Fiverr later, a link wheel is established, sending thousands of anchor-text-optimised links to Competitor B’s site, targeting the domain or the specific page that outranks Competitor A. These links are low quality: they come from new, de-indexed or irrelevant domains, with a total lack of valued content.

Now, in many instances of negative SEO the damage can be short-lived, as webmasters can review their links, performing iterations of link removal and disavowal, in the hope that a manual penalty won’t be received.
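For reference, the disavow file Google accepts is just a plain-text list: one URL per line, a domain: prefix to write off an entire domain, and # for comment lines. Below is a minimal sketch of compiling one in Python; the flagged domains and URLs are hypothetical placeholders, not real offenders.

```python
# A minimal sketch: compile a disavow.txt in the format Google's disavow tool
# accepts ("#" lines are comments, "domain:" entries disavow a whole domain,
# bare URLs disavow a single page). All names below are hypothetical.

flagged_domains = ["spammy-linkwheel.example", "deindexed-blog.example"]
flagged_urls = ["http://low-quality.example/page-linking-to-us.html"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Links we could not get removed via webmaster outreach\n")
    for domain in sorted(flagged_domains):
        f.write(f"domain:{domain}\n")
    for url in sorted(flagged_urls):
        f.write(f"{url}\n")
```

The file is then uploaded through the Disavow Links tool in Webmaster Tools, asking Google to ignore those links when assessing the site.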

However, when a link wheel is used, Competitor B will find itself playing a constant game of ‘link whack-a-mole’, with its outreach department forever contacting the webmasters behind the new poor-quality links that surface each day.

This not only detracts from the campaigns planned by the brand or search agency, but can also inflict fairly serious overheads, whether from tool subscriptions for automated discovery and analysis, or from the man-hours spent reviewing the links one by one.

Even these impacts are nothing compared with the downturn that would follow an algorithmic or manual penalty, so you can understand why so many companies are reviewing their historic links at this stage. Many companies previously selling links have adopted a new stance, monetising the removal of those same links and, in many instances, offering negative SEO as a managed service.

Comparisons

Let’s put this into a more real-world perspective. Imagine you’re the marketing director for Ugg Boots. A competitor goads a reporter into writing a story about how Ugg uses kitten fur to line its famous boots. The story gets shared a few times via social media, and a couple of content aggregators pick up the piece.

The story might not reach many people, as it’s only covered in the local area. However, a year later, when Trading Standards have received complaints from Ugg customers about the provenance of the materials, the article is dug up and revisited by the authorities. Now, I’m not suggesting Trading Standards would believe a poorly marketed story built on purely circumstantial evidence; however, it might affect the decision to investigate the brand, and this is where many companies, plagued by legacy SEO issues caused by previous campaigns, would fall foul.

Ugg would be forced to allocate resources to challenging Trading Standards, effectively clearing its name.

Now, in this highly litigious environment, a story like this wouldn’t even go to print, as a libel case would be quick on the heels of any reporter making accusations like these. But even in today’s world, where jail sentences are handed out for ‘anti-social’ (I despise that term!) behaviour on social networks, the same real-world repercussions are not in place to deter contributors from communicating their defamatory accusations.

Let’s for a second imagine that the Expedia case we’ve seen flung into prominence recently was a bona fide case of negative SEO. Reports showed Expedia’s share price falling by 4% shortly after the news hit that 25% of its visibility had been nuked by the Google penalty. The share price downturn was clearly well mitigated; with a 25% loss of visibility, one might expect more shareholders to jump ship.

Across digital, we often see a strong dependence on natural search traffic, which makes these kinds of punitive actions from Google even more impactful. These penalties can bring small and big brands alike to their knees, as was observed with the Interflora penalisation, which took hold just days before the flower retailer’s second-biggest peak of the year.

Expedia visibility drop – GeekWire

Interflora drop – Martin MacDonald

The cost of performing negative SEO hasn’t really changed in years; if anything, the service has become more widely available and more competitively priced. This may be one of the reasons we’re seeing more instances of this kind of digital warfare, especially among smaller brands in competitive niches.

How Are Search Engines Combatting This?

Well, the honest answer is: they aren’t. With no way of identifying the origin of negative links, search engines like Google and Bing are unable to police effectively against these damaging SEO assaults. Instead, they are left to fire-fight these cases on an individual basis, often relying on tip-offs from competitors, which are of course themselves open to manipulation.

Without established detection triggers and methods, what can search engines do right now to detect negative SEO? Yandex reduced the importance of links in its ranking factors some three months ago. This may be a potential solution, but what factors would replace links if this is the tack taken? Social?

We’ve already seen the extent to which social can be gamed, so scrapping links altogether is probably out of the question. A more reasonable, and more likely, solution would be to tolerate a certain proportion of bad links, or to detect unusually rapid changes in the ratios of anchor text across a brand’s backlink profile.
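To make the second idea concrete, here is a crude sketch of what such a check might look like: compare the current anchor-text distribution against an earlier baseline and flag any anchor whose share of the profile has jumped sharply. The 10% threshold and the sample data are illustrative assumptions, not anything a search engine has confirmed it uses.

```python
from collections import Counter

def anchor_ratios(anchors):
    """Share of the backlink profile held by each anchor text."""
    counts = Counter(anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

def flag_ratio_spikes(baseline_anchors, current_anchors, jump=0.10):
    """Flag anchors whose share of the profile grew by more than `jump`."""
    before = anchor_ratios(baseline_anchors)
    after = anchor_ratios(current_anchors)
    return {
        text: (before.get(text, 0.0), share)
        for text, share in after.items()
        if share - before.get(text, 0.0) > jump
    }

# Illustrative data: a sudden flood of one commercial anchor.
baseline = ["brand name"] * 80 + ["cheap flights"] * 5 + ["www.example.com"] * 15
current = baseline + ["cheap flights"] * 40
print(flag_ratio_spikes(baseline, current))
# {'cheap flights': (0.05, ~0.32)} – the kind of spike a link wheel produces
```

A webmaster can run the same check defensively, of course: a spike in an anchor you didn’t build is an early warning that someone else is building it for you.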

These would not be iron-clad fixes by any means, but they would level the playing field in an environment where verticals are so saturated that every brand is looking for an edge over its competitors. At this stage, the responsibility for policing negative SEO is left firmly in the hands of us marketers.

If we can protect our brands – or, better yet, prevent them from being targeted in the first place by being outwardly altruistic and vocal in our verticals – then this should form part of every exhaustive SEO strategy.

How Can I Detect Negative SEO?

I will always be a staunch supporter of manual reviews; however, there are one or two useful utilities out there to help you identify potentially damaging links.

If you’re short of time, they are certainly worth considering, but remember that you will usually have to sense-check many of the links manually: these tools will often highlight sites with poor trust flow, but they won’t flag websites that are reputable yet totally irrelevant.

Tools

Once you’ve got your tools or Excel sheet set up, you’ll need to gather links from multiple sources to ensure you have the full link-profile picture.

  • Google Webmaster Tools: download your links from GWT by selecting ‘Search Traffic’, then ‘Links to Your Site’, then ‘Download the latest links’. You can also download all the linking domains here, but note that on any given day the sampled data can vary, so to get a full picture, download your links across numerous days and compile them to remove any duplicates (see the sketch after this list).
  • MajesticSEO: go to the horse’s mouth – Majestic supplies an awesome amount of data to many other third-party providers in the SEO industry, and because it has been crawling this data for so long, it has the most comprehensive index of links and, significantly, a timeline of when each link was first discovered. This is fantastic if you’re not cleaning up negative SEO but stripping back the mistakes of a lazy SEO firm, as you can qualify links by the date they were acquired, compared against Google algorithm and guideline updates.
  • Moz Open Site Explorer: ‘Just discovered’ is a great free utility showing the most recently discovered links to your site. Taking this data daily is a quick and digestible way of quality-controlling your incoming links once the initial link-removal audit is complete.
  • Fresh Web Explorer: from Moz again, this shows you recent mentions of any keyword you enter, or links to a specific URL, and it even offers daily alerts. This can be great for the more complex and advanced negative SEO campaigns in which false reviews and articles are generated to support the malicious links.
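Pulling those exports together is mostly a deduplication exercise. Here is a minimal sketch under the assumption that each day’s download is saved as a CSV with the linking URL in the first column (the filenames and folder are hypothetical):

```python
import csv
import glob

def load_links(path):
    """Read linking URLs from the first column of an exported CSV,
    skipping header rows and anything that isn't a URL."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[0].strip() for row in csv.reader(f)
                if row and row[0].strip().startswith("http")}

# Compile several days of GWT/Majestic/Moz exports into one deduplicated set.
known_links = set()
for export in glob.glob("exports/*.csv"):
    known_links |= load_links(export)

# Daily quality control: anything in today's download we haven't seen before.
new_links = load_links("today.csv") - known_links
for url in sorted(new_links):
    print("New link to review:", url)
```

Anything the script surfaces still needs a manual sense-check, for exactly the reason covered earlier: a tool can spot a new link, but not whether it’s relevant.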

Discussion

Now for the potential Fight Club element (the first rule of Fight Club is: you do not talk about Fight Club). Should we openly discuss this topic at all? I’d be intrigued to read both your thoughts on the post and any wider contribution to this current SEO discussion.