Aaron Wall has an excellent post today discussing how websites can use the long tail to generate a lot more reliable traffic than simply going for the major keywords in a particular niche.
Consider an average e-commerce store selling thousands of books. It would be great if the site could rank number one for the keyword 'books' but in reality that isn't going to happen without a lot of work.
A much easier solution is to make sure your website ranks number one for the names of all the products you sell. This means you have thousands of number-one rankings for terms that will probably only send one visitor a month. Those visitors all add up to a very sizeable, highly lucrative chunk of traffic.
Visitors searching for long tail product terms are focused on buying that particular product and convert at much higher rates than somebody searching for 'books'.
Although achieving long tail rankings is easier than getting rankings for major keywords, it is still quite hard for some e-commerce websites to overcome indexing issues built into their systems.
To take full advantage of the long tail, websites need to be well structured, with basic elements in place to avoid duplicate content and the 'bloat' caused when a content management system generates hundreds or even thousands of useless pages.
A good tip is to check all the different pages on your site and see whether they are being indexed by the search engines. If they aren't, try to find out why. Common issues include a lack of PageRank, caused by too few links pointing to that page, or the page being very similar to another page on your site.
Each website has a finite amount of PageRank to share between its pages. If you have 10,000 pages then each page will have a lot less PR than if you only have 10 pages. It follows then that larger sites need to work much harder on link building if they want their long tail rankings to scale throughout the site.
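The dilution effect described above can be seen in a toy simulation. The sketch below is not Google's actual algorithm, just the classic power-iteration PageRank run over a hypothetical site where the homepage links to every product page and each product page links back home; the site names and structure are illustrative assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simple power-iteration PageRank over a dict {page: [outbound links]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            # Each page splits its rank evenly among the pages it links to.
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

def product_page_rank(num_products):
    """Homepage links to every product page; each product links back home."""
    links = {"home": [f"p{i}" for i in range(num_products)]}
    links.update({f"p{i}": ["home"] for i in range(num_products)})
    return pagerank(links)["p0"]
```

Comparing `product_page_rank(10)` with `product_page_rank(10_000)` shows each product page receiving a much smaller share of rank as the site grows, which is why larger sites need proportionally more inbound links for their long tail pages to get indexed and rank.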
Some webmasters submit an XML sitemap to let the search engines know about all their pages but this shouldn't be seen as a quick fix for a badly designed website or lack of PageRank.
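For reference, a sitemap of this kind is just an XML file listing page URLs, as defined by the sitemaps.org protocol. A minimal sketch of generating one, using hypothetical product URLs purely for illustration:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical product URLs for illustration:
sitemap = build_sitemap([
    "https://example.com/books/war-and-peace",
    "https://example.com/books/moby-dick",
])
```

As the paragraph above notes, submitting such a file only tells the engines your pages exist; it does nothing to fix the site structure or PageRank problems that stop them ranking.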
Try to solve your indexing issues manually first and it will pay dividends in terms of better rankings in the long term.
Related research: SEO Best Practice Guide