Many large brands are thought to have fallen victim to a Google penalty for infringing the search giant’s guidelines against web spam.
How can you tell if your site might be at risk?
The consequences of being penalised by Google can include a drop in search rankings and visibility, and thus a reduction in online traffic and revenue, a damaged reputation and (for listed companies) potentially a reduced stock market value.
Google is becoming more effective at identifying web spam. Search engine optimisation (SEO) techniques that were once effective and commonly used are now far more likely to be detected and punished.
Simply put, Google is fighting against sites that try to game their way to the top of search results using spam techniques that infringe its webmaster guidelines.
These techniques are considered bad for search because they drive relevant websites lower down the search results, making pages from legitimate website owners harder to find.
The search engine’s algorithms detect many spam practices automatically, demoting the sites that use them. It also employs teams who manually review sites for spam activity.
Here are some key questions that marketers must ask if they want to evaluate their risk of a Google penalty. They are grouped under four key headings.
Google wants searchers to land on web pages that provide good quality, useful content that adds value and is relevant to their search query.
As a marketer you need to ask a number of questions about your site’s content and the way Google’s crawlers will view it:
Is our content duplicated on multiple pages on our site or externally?
If a lot of your content also appears on other web pages, your pages are not adding value, and this can keep them out of the top positions in Google.
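As a rough illustration (not how Google actually does it, which is not public), near-duplicate content can be spotted by comparing word "shingles" between two texts:

```python
# Toy near-duplicate check using word shingles, one common family of
# techniques for duplicate detection. Purely illustrative.
def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word sequences appearing in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of the two shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

print(similarity("the quick brown fox jumps", "the quick brown fox leaps"))  # 0.5
```

Two pages scoring close to 1.0 against each other are, by this crude measure, effectively the same content.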
Do we show the visitors to our site the same content as we show to Google’s data crawler?
Displaying different content to Google's crawler than you show human visitors (which is technically possible) in order to gain a ranking advantage is known as "cloaking".
This type of tactic is targeted by Google's Panda update, rolled out in 2011, which focused on reducing the volume of unhelpful and irrelevant content that appears in searches.
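To make the tactic concrete, here is a minimal, entirely hypothetical sketch of what cloaking looks like server-side. This is the pattern Google penalises, shown only so the risk is recognisable, not something to deploy:

```python
# Hypothetical cloaking handler: the server inspects the User-Agent header
# and serves a keyword-stuffed page to Google's crawler only. This is
# exactly what Google's webmaster guidelines forbid.
def serve_page(user_agent: str) -> str:
    if "Googlebot" in user_agent:
        # stuffed version shown only to the crawler
        return "<p>cheap loans best cheap loans buy cheap loans now</p>"
    # normal version shown to human visitors
    return "<p>Welcome! Compare our loan products below.</p>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

If the answer to the question above is "no, the crawler sees something different", this branching is usually where the difference comes from.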
Have we used tricks on sub pages such as white lettering on a white background or hidden text?
Google's crawlers analyse the words used on a page to determine the content's meaning and, from that, the keywords it should rank for. Some webmasters therefore use tricks like this to insert additional keywords into their pages: white lettering on a white background makes them invisible to human visitors.
This is a clear signal of spam and can very quickly lead to a penalty.
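A crude heuristic for the hidden-text trick described above could look like this (a simplified sketch; real spam detection is far more sophisticated):

```python
import re

# Flags inline styles commonly used to hide keyword text from human
# visitors: white-on-white colouring, display:none, zero font size.
# A deliberately crude heuristic for illustration only.
HIDDEN_STYLE = re.compile(
    r"color:\s*(#fff(?:fff)?\b|white)|display:\s*none|font-size:\s*0",
    re.IGNORECASE,
)

def has_hidden_text(html: str) -> bool:
    return bool(HIDDEN_STYLE.search(html))

print(has_hidden_text('<span style="color:#ffffff">payday loans</span>'))  # True
print(has_hidden_text('<p style="color:#333">Our loan products</p>'))      # False
```

Running a check like this over your own templates is a quick way to catch legacy tricks left behind by a previous agency or developer.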
2. Buying links:
The number and quality of links to your site from other sites is one of the factors that Google uses to determine rankings.
So if your site has many inbound links (backlinks) from high quality, well respected websites, this is thought to be a ‘vote of confidence’ for the content on your pages and accepted as having a positive influence on your rankings.
This is why some webmasters build links artificially by paying other webmasters to link to their pages. Often, such links are disguised within guest posts.
But if you are paying for links then those links are not genuine and are an infringement of Google’s webmaster guidelines.
Questions you should ask in this area are:
Do we regularly check the websites that link to our pages?
You need to review and eliminate any links that were originally good, but have now gone ‘bad’.
For example, some of the original pages or sites that link to your site may have changed, changed ownership or may no longer be relevant.
Did we pay for any other sites to link back to us?
Paid-for links are not allowed by Google, as discussed. The search engine also puts link networks (networks of sites or blogs with a large number of reciprocal links designed to deliver a rankings benefit) and pure SEO web or article directories (online directories set up purely to provide backlinks that help SEO professionals boost their sites' rankings) under the same umbrella as paid-for links.
Have we paid third-party sites for posting several guest articles linking to us in the past few years or do we have several guest articles on our site?
Placing guest articles incorporating backlinks to your site on other sites is a common SEO tactic but paying for these articles just for link building reasons is against the rules, so a large number of guest articles is a potential spam signal for Google.
Guest posts only make sense if their content is relevant to the site they are posted on, as well as to its visitors.
How natural is your link profile?
It is believed that Google analyses the inbound links to a site's pages and compares the profile to what would occur naturally, i.e. without artificial manipulation designed to trick Google into delivering a rankings benefit.
So you should ask a number of questions about the overall structure or profile of your links:
Do we have a lot of reciprocal links i.e. the result of a simple link exchange between us and other sites?
This is a common spam practice that Google can recognise.
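On a small, hypothetical link graph, reciprocal links are simply mutual edges, which makes them easy to enumerate:

```python
# Hypothetical link graph: each site maps to the set of sites it links to.
links = {
    "a.example": {"b.example", "c.example"},
    "b.example": {"a.example"},
    "c.example": set(),
}

# A reciprocal link exists wherever both directions are present.
reciprocal = {
    tuple(sorted((source, target)))
    for source, targets in links.items()
    for target in targets
    if source in links.get(target, set())
}
print(reciprocal)  # {('a.example', 'b.example')}
```

If a large fraction of your backlinks turn out to be mutual edges like this, the profile starts to look like an exchange scheme rather than organic linking.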
Do we have a lot of (keyword) links in the footer?
In the past it was very common in the SEO industry to incorporate many keyword links into the footer of company sites.
But Google is now better at identifying this kind of 'link optimisation', and it can have a negative impact on rankings.
Do we have lots of ‘bad/spammy’ links to our page?
Having a lot of links to your site which come from other pages' sidebars and footers, or from pages stuffed with links, is thought to be viewed negatively by Google.
Do we have many links using (only) keywords as the link’s anchor text?
Having too many links anchored on the exact keyword terms a page is supposed to rank for (e.g. a link to a page dealing with payday loans via the anchor text "cheap loans") is believed to hurt the value of your link profile.
It is thought Google is likely to devalue those links, assuming they were created for the purpose of gaming the system.
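A first-pass check of your own anchor-text distribution might look like this (the anchor data is made up, and any "safe" threshold would be a judgment call, not a published Google number):

```python
from collections import Counter

# Hypothetical backlink anchor texts, as pulled from a link report.
anchors = [
    "cheap loans", "cheap loans", "cheap loans", "cheap loans",
    "example.com", "click here", "cheap loans",
]

counts = Counter(anchors)
exact_share = counts["cheap loans"] / len(anchors)
print(f"exact-match anchor share: {exact_share:.0%}")  # 71% here
```

A natural profile tends to be dominated by brand names, bare URLs and generic phrases; a profile dominated by the money keyword, as above, is the pattern described in this section.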
Google’s crawler can analyse the software underpinning your site, so you should ask questions about whether there is software technology which could be malicious to visitors or is trying to fool its algorithm into giving your site a false rankings advantage:
Have we ever had malware on our site?
Sites containing malware can be discovered by Google’s web crawler and excluded from its search listings index, which makes it very important to pay attention to all Content Management System (CMS) or other software updates for your site.
How many of the pages in our XML sitemap will actually be crawled by Google?
The sitemap is a table of contents that the website owner creates and that Google's crawler can read.
If the number of pages Google indexes is bigger than the number listed in the sitemap, there's a chance you have duplicate content spread across many URLs.
This issue is very common for ecommerce sites and is arguably the most damaging on this list for SEO. A single category page on some retail websites could have over 100 variations of its URL, due to the many combinations of parameters for facets / filters.
Here is an example of how a duplicate content issue caused by faceted navigation could arise:
[Image: the unfiltered category page URL (top) and a filtered version of the same page (bottom).]
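The URL explosion that facets cause can be sketched (with made-up parameter names) by stripping filter parameters back to a canonical URL:

```python
from urllib.parse import parse_qsl, urlsplit

# Hypothetical category URLs produced by faceted navigation.
urls = [
    "https://shop.example/shoes",
    "https://shop.example/shoes?colour=red",
    "https://shop.example/shoes?colour=red&size=9",
    "https://shop.example/shoes?sort=price&colour=blue",
]

FACET_PARAMS = {"colour", "size", "sort"}  # illustrative names only

def canonical(url: str) -> str:
    """Drop facet/filter parameters so all variants collapse to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return parts._replace(query="&".join(f"{k}={v}" for k, v in kept)).geturl()

print({canonical(u) for u in urls})  # {'https://shop.example/shoes'}
```

Four crawlable URLs, one page of content: in practice this is usually handled with canonical tags or parameter handling rather than a script, but the arithmetic of the problem is the same.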
If Google indexes fewer pages than listed on the sitemap, then it’s an indicator that you could have ‘thin’ content it doesn’t want to show searchers.
Both are negative quality signals that may indicate Google does not trust your content.
Do pages take a long time to load?
Lengthy loading times not only reduce potential conversions (because visitors get fed up with waiting); they can also lower your rankings and even (if the load time is excessively long) prevent Google from indexing the page.
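A rough way to check the download portion of load time is to time the HTML fetch itself (network transfer only; a real page load also includes rendering, scripts and images, so treat this as a lower bound — the URL below is a placeholder):

```python
import time
from urllib.request import urlopen

def timed_seconds(action) -> float:
    """Time any zero-argument callable and return the elapsed seconds."""
    start = time.perf_counter()
    action()
    return time.perf_counter() - start

# Placeholder URL -- substitute one of your own pages. A fetch taking
# multiple seconds is a sign worth investigating with proper tooling.
# elapsed = timed_seconds(lambda: urlopen("https://www.example.com/").read())
# print(f"HTML downloaded in {elapsed:.2f}s")
```

Browser developer tools or dedicated page-speed tools give a fuller picture, but even this crude timer will surface a server that is slow to respond.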