Hopefully what follows will convince you that a lot of SEO is about pleasing the user, who must come first. The search engines may (largely) run on robots, but these robots are becoming increasingly human-like in their preferences with every algorithm update, and they know when they’re being manipulated.

If you’re a complete beginner to the world of SEO, check out What SEO beginners need to know: a basic skills guide on the Econsultancy blog for a good primer. If that whets your appetite, read SEO Best Practice Guide: The Fundamentals of Search Engine Optimisation to take it further. Econsultancy also offers SEO training.


Algorithm updates

All search engines determine the relevance of the pages they index using their own algorithms. Improvements are being made to these algorithms on a daily basis to make the search engines smarter.

Although the exact calculations that algorithms make in order to rank pages are a secret closely guarded by the search engines, Google does report the changes that will have the greatest impact on search results, such as 2011’s Panda, which penalised sites with thin, low-value content, and 2012’s Penguin, which took aim at poor-quality links.

SEOs who understand these changes will be better equipped to optimise their sites for what Google is looking for, and brief marketing teams so they can implement content strategies.

SEO Best Practice: Algorithm Updates outlines the most important updates to the Google algorithm since 2003.

Audience analysis

Before drawing up a new SEO strategy, a business must understand what its customers – potential and existing – are looking for, and work out how that aligns with its offering. Audience analysis is the process of researching the behaviour of a target group, which involves determining keyword importance as well as identifying common needs and pain points.


Audit

For businesses looking to optimise their sites, carrying out an audit is often a great place to start. A site audit shakes out any issues that will affect the website’s visibility in search engines, such as duplicate URLs or slow page load times. Tools such as Screaming Frog let you carry out a technical SEO audit and spot any backend snags.

Econsultancy’s SEO Best Practice: Benchmarking and Auditing report walks through the main steps of a website audit.


Backlinks

A backlink is simply a link from one website to another. Search engines generally see backlinks as a sign of quality, reasoning that if many websites link to a particular page, it must be useful. Search engines have long used backlinks to determine organic rankings.

However, these links must be deemed natural by the search engines. If it appears the backlinks are a result of an aggressive link building campaign, built simply to boost rankings rather than to enhance the user experience, the site will be penalised.

For more info, read SEO Best Practice: Link Strategy and Tactics.


Benchmarking

Along with carrying out a site audit, it makes sense to put some benchmarks in place so that it’s possible to monitor performance over time and work out which tactics are paying off.

Businesses can measure the performance of their traffic, conversions or processes against another business, or against a group of businesses whose specific KPIs are known.

Benchmarking comes in three flavours: internal benchmarking, where the organisation plays against itself; competitive benchmarking, where the business compares its performance with that of a competitor; and strategic benchmarking, which considers the strategies of other businesses.

Econsultancy’s SEO Best Practice: Benchmarking and Auditing explains how to benchmark company performance against external measures to get an edge on the competition.

Black/grey/white hat SEO

Much of SEO is a fine balancing act. While there are search engine-approved tactics a webmaster can use to give their website the best chance of topping the SERP, such as using Schema markup (more on that later), overdo it and the search engines may penalise you for not playing fair (more on that later, too).

The things webmasters can do to optimise their sites fall into the following three fuzzy categories that go from agreeable, via dubious, to dodgy: white hat, grey hat and black hat SEO. You’ll probably have a good feel for what kinds of practices fall where on this scale, but as a quick guide:

  • White hat SEO refers to those practices given the thumbs up by Google, including technically optimising the site and posting content that is helpful to users
  • Grey hat SEO covers practices like buying links or software packages to boost visibility. Not strictly verboten, but walking a fine line as the motivations behind it do not prioritise the user experience
  • Black hat SEO covers the predatory stuff, like hacking websites, or buying WordPress plugins that inject links into unsuspecting sites. Black hat describes practices that actively break the rules and sometimes, even the law.

Breadcrumb trail

A breadcrumb trail is a set of clickable links that shows users how deep they’ve navigated into a site. For example, if they were admiring a tie with an attractive toucan pattern on an online tie store with a particularly wide collection, the breadcrumb trail might go:

Home > Ties > Neckties > Classic width neckties > Patterned > Birds.

Although they let Hansel and Gretel down on their first IRL outing as a navigation tool, breadcrumb trails are still beloved by users and search engines alike. They give the former an idea of where they are on the site, and an easy means of browsing back, while they give the latter a good idea of a website’s hierarchical structure.

Note that search engines might display the breadcrumb trail to a page rather than the page’s URL in the SERP.
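Breadcrumb trails can also be described to search engines using Schema.org’s BreadcrumbList structured data, which is how they end up displayed in the SERP. A minimal sketch for the hypothetical tie store above (the domain and paths are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Ties",
      "item": "https://example.com/ties"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Neckties",
      "item": "https://example.com/ties/neckties"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "Patterned"
    }
  ]
}
</script>
```

Each ListItem carries its position in the trail and a link, except the final item, which represents the current page.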


Cannibalisation

Another one of the SEO pitfalls, cannibalisation is what happens when several of the pages on your site rank for the same keyword and are effectively competing with each other. Competing pages may appear in the SERPs but ultimately won’t rank as well as they would on their own, as the search engine cannot work out which page to favour.

Cannibalisation is a bigger issue on larger websites with many pages, as it isn’t always easy to spot but can have a big impact on search rankings.

Tools offered by Moz, SEMrush and Majestic can help spot keyword cannibalisation and fix any issues.


Canonicalisation

It may be the case that your site has several URLs pointing to the same page. When the spiders come to crawl your site, they’ll find all those URLs serving the same content, and as there is no canonical version assigned, they’ll choose one of these URLs to rank (usually the one with the most inbound links, but not always).

Why is this a problem? Loss of link equity (covered later in this briefing). You need quality links to a website to make it rank higher than its competitors. If there are many different links to the same content, the value of that content is being spread too thinly across all of those links. In other words, those links are being diluted, which has a big impact on your ability to rank.

Canonicalisation, or assigning a canonical version of the URL by means of a canonical tag, tells the search engines which URL is the ‘master’ version of the page, and therefore which to serve in the search results.

Econsultancy’s SEO Best Practice: Technical SEO report explains how to assign a canonical version of a URL.
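As a brief illustration of what this looks like in practice (the domain and URL parameters are placeholders), every variant of the page carries the same canonical tag in its <head>:

```html
<!-- On https://example.com/ties?sort=price,
     https://example.com/ties?utm_source=newsletter
     and https://example.com/ties alike, the <head> declares
     which URL is the 'master' version of the page: -->
<link rel="canonical" href="https://example.com/ties" />
```

With the tag in place, link equity from all the variants is consolidated on the canonical URL rather than spread across duplicates.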

Conversion rate optimisation

If conversion rate is the percentage of users to the site who take a desired action, which could be anything from viewing a video, filling in a form, or making a purchase, conversion rate optimisation is the practice of optimising the site to make these conversions more likely.

Site owners can boost their conversion rate by doing one of two things: increasing the actions made by the same audience, or registering the same number of actions from a smaller, but more interested audience.

Although conversion rates can serve as a good metric for measuring engagement, there is no useful conversion rate benchmark as industries, products, people and markets vary so much. As such, there is no ‘best practice’ for conversion rate optimisation – optimal conversion rates will depend on the business need.

That said, a good place to start would be to establish the brand’s goals and what the user wants, before determining how its product and proposition fits with those goals and the company’s available resources. Testing and experimentation play a considerable part in conversion rate optimisation, which is covered in Econsultancy’s SEO Best Practice: Landing Page Optimisation report.


Crawling

Search engines discover websites through a process called ‘crawling’, carried out by web crawlers, bots, or spiders. These crawlers look for new or updated webpages by following links or reading sitemaps. They store copies of the pages they discover in the search engine’s database in a process known as indexing.

Fundamentally, the SEO’s job is to ensure their site is crawlable, i.e. easy for the search engines to retrieve every page on a website (though humans are an important consideration too, as we’ll find later in this briefing).

According to Google, if its crawlers miss a site, it is most often because the site isn’t well connected, its design doesn’t allow for easy crawling, or Google receives an error when trying to crawl it.
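One of the basic levers a site owner has over crawling is the robots.txt file, which tells crawlers which areas of the site to skip and where to find the sitemap. A simple illustrative example (the paths are invented):

```text
# robots.txt, served at https://example.com/robots.txt
# Applies to all crawlers
User-agent: *
# Keep crawlers out of low-value or private areas
Disallow: /checkout/
Disallow: /account/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing – a disallowed page can still be indexed if it is linked from elsewhere.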

Featured snippet, or ‘position zero’

Featured snippets are the search results that appear above Google’s standard list of 10 organic search results. They usually appear in their own box, and aim to answer the searcher’s question directly on the SERP, so that the user doesn’t need to click through to a website. The featured snippet is also known as Position Zero.

Google doesn’t provide guidance as to how to appear in a featured snippet, however, creating content that addresses a need or pain point and structuring that content in a way that is understood by the search engine is a good place to start (see ‘meta tags’).


Keywords

In SEO, keywords are simply the words or phrases that a searcher enters on the search engine page. They’re the language that site owners need to speak in order to attract new customers to their site, serving as something of a bridge between what the searcher is looking for and what the site can offer.

In the early days of SEO, keywords were seized upon as a sure-fire way of securing traffic, with optimising techniques operating on the concept of ‘if some is good, then more must be better’. The result was webpages stuffed with keywords, sometimes beyond comprehension.

The search engines have long since wised up to this approach, and will penalise sites that appear to be keyword stuffing. Today, best practice is to work keywords – along with synonyms and modifiers, if possible and appropriate – naturally into the content. Keyword density targets are a fallacy.

Keyword research

If SEO is a practice that focuses on appealing to both humans and machines, keyword research very much covers the human side of this equation. Keyword research is the process of finding and understanding the search terms most frequently entered into search engines when people are looking for a specific product or service, or have a question.

Aligning your content with these queries directly serves these potential customers, who will be more likely to return to your site.

The benefits of keyword research go beyond SEO – the findings will also help marketing teams better understand their target market and what they’re interested in, which could inform content strategy or even the products and services it offers.

Tools including Google Keyword Planner, Ahrefs and SEMrush can help SEOs discover the most important keywords in their industry, and how they vary in different countries and different times of the year.

Read about how Hanapin Marketing used keyword research and creativity to optimise its Facebook ads and boost engagement for a data security company in this case study.

Link equity

Link equity, also known as ‘link juice’, describes the value that a link passes from one page to another. According to Moz, search engines use links that pass equity as a signal to determine how a page should rank in the SERPs. The amount of value a page can pass on depends on the number of links coming into the page, both from within the site and from outside it.

For more info, read SEO Best Practice: Link Strategy and Tactics.

Local pack

The ‘local pack’ is the list of three local businesses that usually appears when a Googler searches for a service. Also known as a ‘3-pack’, it includes map locations for search results. As explained in Econsultancy’s Local SEO Best Practice Guide, search engines use review ratings as a ranking signal, which can influence your position in the local pack.

Long tail keywords

Sometimes called the ‘tail of search’, long tail keywords are those search queries that are made less frequently, though are more specific in content. For example, there are fewer searches for ‘toucan print neck tie’ than there are for ‘tie’. Note that ‘long tail’ doesn’t refer to the number of words in the search query.

If the goal is to drive traffic, it may make more sense to aim for long tail keywords, as they’re less competitive, and suggest greater intent – conversion rates on long tail keywords are much higher than for top-level generic keywords.


Markup

Markup describes the code that is added to the HTML of a page to help crawlers understand its content. There are many different types of markup that are used in different situations, though probably the most commonly used is Schema, which was developed by Google, Microsoft, Yahoo and Yandex.

Schema markup gives the content on webpages more context, and as it helps search engines better understand the content on a page, it can help with ranking.

Using schema also makes your content more likely to appear as a rich result on the SERP (such as a featured snippet). Google uses a kind of markup – structured data – to create rich snippets that may include information like review ratings, stock level, a product image or price.
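To make this concrete, here is a sketch of Schema.org Product markup in JSON-LD – the format Google recommends for structured data – covering the kind of information (ratings, price, stock level) that can surface in a rich result. The product details and URLs are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Toucan Print Necktie",
  "image": "https://example.com/images/toucan-tie.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "38"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

The script block sits in the page’s HTML and is invisible to visitors; only crawlers read it.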

Meta tags

Meta tags provide search engines with contextual information about a webpage, such as the keywords it is associated with, the language it is in, whether it should be indexed, or if the page is canonical. They are part of the HTML of the page and are invisible to the user.

There are many different types of meta tag, though we’ll focus on three here that appear to be the most useful to SEO: title, description and header.

The title tag is displayed in the browser tab and may also appear in search results. Title tags should make sense – lists of keywords will not cut the mustard. It appears that the closer to the start of a title tag a keyword is, the more helpful it is to ranking and the more likely a user is to click through.

Description tags populate the text below each linked search result in the SERP. They can be helpful for ranking, but aren’t critical, as search engines can generate their own descriptions using the page content as a reference.

Header tags, which are designated by the tags <h1> up to <h6> (where <h1> is the most ‘important’), are good for organising content on a page. While the <h1> tag is generally used for page titles, <h2> tags often denote the ‘chapters’ within the text on a page.

Using these tags not only makes your content easier to navigate – used intelligently in conjunction with informative content that serves a need, they can also land you in the coveted featured snippet spot. Research by HubSpot has suggested that featured snippets pull their titles from the <h2> tag and their list items from the <h3> tag.
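Pulling the three tag types together, a minimal sketch of a page’s markup might look like the following (the store, product names and content are invented for illustration):

```html
<head>
  <!-- Title tag: shown in the browser tab and often in the SERP;
       keywords nearer the start tend to carry more weight -->
  <title>Toucan Print Neckties | Example Tie Store</title>

  <!-- Description tag: populates the snippet text below the link in the SERP -->
  <meta name="description" content="Classic-width neckties in bold toucan prints, from a particularly wide collection of bird-patterned ties.">

  <!-- Robots meta tag: whether the page should be indexed -->
  <meta name="robots" content="index, follow">
</head>
<body>
  <h1>Toucan print neckties</h1>       <!-- one h1: the page title -->
  <h2>Choosing a toucan tie</h2>       <!-- h2s: the 'chapters' of the page -->
  <h3>Classic width</h3>               <!-- h3s: sub-sections or list items -->
  <h3>Skinny</h3>
</body>
```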

Off-page SEO

Where on-page SEO describes the changes you can make to elements on your website with the goal of making it more attractive to search engines, off-page SEO refers to methods external to the site that can boost its popularity, such as link building (organically, we stress) or promotional means like working with journalists or social media.

On-page SEO

As noted – just now, actually – on-page SEO describes the modifications a site owner can take directly on their site to help boost its ranking in the SERPs. More specifically, it refers to changes that can be made to a site’s content, including metadata.

The idea with on-page optimisation is to help the search engines understand the content of a site, though it’s important not to over-optimise. Instead, make your priority delivering relevant, useful content in the most appropriate context. Otherwise, you may run into the following…


Penalty

‘Penalty’ is the general term given to a suppression of rankings enforced by a search engine when a site is deemed to have breached its guidelines. This could result in part of a site or the whole site losing its ranking for a period of time, usually until the situation that caused the penalty is rectified.

There are two types of Google ‘penalty’, known as true and algorithmic. Algorithmic penalties happen when a previously good practice suddenly falls outside what is allowed by the search engine. True penalties, or manual actions, are applied to sites that are deemed to be breaking the rules by a human assessor. ‘Penalty’ is a loosely defined term, and so some SEOs would only regard a manual action as a real penalty.

To avoid these suppressions, or ‘penalties’, organisations should make regular checks for changes in guidelines and regulations for the search engines, particularly Google, which frequently modifies its algorithm. Google’s Webmaster Guidelines serves as a good rule book for beginners.


RankBrain

Introduced in 2015, RankBrain is a part of Google’s core algorithm that uses machine learning to improve search results. Since its introduction, Google Senior Research Scientist Greg Corrado has confirmed that RankBrain is the search engine’s third most important ranking factor.

According to one study of 1.4 million search queries, which compared pre-RankBrain results with those generated after launch, there was a 54.6% improvement in quality of search results for a query – RankBrain was weeding out the irrelevant results.

Site speed

Site speed is a key consideration when it comes to SEO, for a few reasons. Web crawlers must be able to retrieve the pages they crawl quickly – if the site is slow, they simply won’t crawl as many of its pages, which will have an impact on rankings. Particularly slow pages may even be dropped from the rankings completely.

Site speed is also important to your human visitors, who are more likely to engage with a faster site. Google also uses these engagement signals to determine its rankings.

Think about the speed of your site on mobile, too. According to this blog post by Google on mobile site speed, bounce rate dramatically increases with load times, which could reduce overall conversions on mobile – not ideal. “In short, speed equals revenue,” the blog says.

As a quick diagnostic, use this PageSpeed Insights tool by Google which analyses webpage content and generates suggestions for making the page faster. Econsultancy’s SEO Best Practice Guide: Technical SEO also provides guidance on improving your site’s speed.

Read about how Buddy Loans increased its conversion rates by 17% by making technical improvements to its backend, including increasing its site speed, in this case study.


Sitemap

A sitemap is an XML file that brings together all the URLs for a website along with their associated metadata, such as when the URL was last modified, how often its content is changed, and its importance relative to other URLs on the site.

Sitemaps not only make it easier for search engines to discover a site’s pages, they let them crawl the site more intelligently, helping them prioritise which pages to crawl. As Rebecca Sentance writes in the Econsultancy blog post What SEO beginners need to know: a basic skills guide, sitemaps are particularly useful for sites whose pages aren’t easily discoverable by Google, e.g. if there are few links to them.
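A minimal sitemap file, following the sitemaps.org protocol, looks like the sketch below. The URL, dates and values are illustrative; each <url> entry holds the metadata described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/ties/neckties</loc>
    <!-- when the URL was last modified -->
    <lastmod>2019-06-01</lastmod>
    <!-- how often its content is likely to change -->
    <changefreq>weekly</changefreq>
    <!-- importance relative to other URLs on the site, 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

The finished file is typically served at the site root (e.g. /sitemap.xml) and referenced from robots.txt or submitted via Google Search Console.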


Spiders

Another word for crawlers – what search engines use to discover and index the content on websites.

There are spidering tools available that allow SEOs to audit their websites, such as Screaming Frog’s SEO Spider Tool, which can be used to spot issues such as URLs that do not have a canonical version set.

Econsultancy’s Guide to Digital Marketing Tools gives a practical overview of 60 popular and peer-recommended tools, including those used in SEO practices.

Split testing

Also known as A/B testing, split testing is the process of running alternate versions of the same page simultaneously to determine what works best.

It comes in other flavours, including multi-variate testing (MVT), where numerous elements on a page are swapped in and out, but the fundamental idea remains the same: show one part of the audience – the control – the original version of a page, say, while showing another – the variant – an alternative. Compare their performance with the metric of your choice (that is aligned with the company’s KPIs) and adjust the page if the variant produces better results. Lather, rinse, repeat.

Split testing content allows webmasters to identify opportunities and structure their landing pages for the best possible performance at the time. It is an iterative process – as audiences and their circumstances change, site owners must always be experimenting and optimising.

User-generated content

User-generated content (UGC) is content, such as reviews, or social posts with video or images, created by customers rather than by brands. UGC can help improve conversion rates for ecommerce brands, as customers are more likely to trust the reviews of their peers. But what does UGC have to do with SEO?

As it provides a constant feed of new and engaging content, UGC can be great for SEO. New pages with new keywords may boost the site’s visibility while the content’s authenticity nurtures trust.

Consider allowing users to make their own contributions to a site, whether that be by allowing comments on a blog, or publishing their product reviews. While you could use an external platform, such as Facebook, to open this interaction with the customer, ensure that any resulting content is kept on your domain to reap the SEO benefits.

SEO Best Practice: On-Page Optimisation offers further guidance on how to use UGC to boost SEO.