Copywriter at HappyCopy
13 April 2009 21:55pm
I am a journalist with a sideline in SEO copywriting and I have a question I hope you SEO experts can help me with.
While I create ghostwritten blog posts which naturally contain keywords and strive to add value to a site and encourage inbound links, I notice a growing number of companies are simply creating unique content which is still a rehash of existing content.
For example, there has been a considerable rise in firms summarising popular blog posts and the like, but doing so in a way that ensures the words they use are unique and not quite plagiarism, with the original blog posts being fully cited.
So, here is my question: does this kind of content help? Obviously it is not going to build inbound links, but is simply having regularly updated, keyword-rich content enough to climb the rankings?
What I suppose I am asking is whether I should also be offering this watered-down copywriting service. It would certainly be easier than the content I create at present.
Does Google still care about unique content or is it, as I thought, all about inbound links?
All information gratefully received!
Journalist and SEO copywriter
Technical Project Manager (MBA, MBCS, CITP, CEng) at Naxtech.com
14 April 2009 13:37pm
It's definitely NOT all about inbound links. That was the case many years ago with the initial Google PageRank algorithm, but these days the algorithm has changed significantly, and in fact it changes practically every day or week.
But even in today's terms, PageRank denotes how "important" a site is and does not reflect how well it will rank for a particular search term.
As I have said previously, PageRank is a rough estimate of the overall importance of a web page. Many factors influence PageRank, so it is a poor indicator of how well a page ranks for particular keywords. PageRank does, however, often affect the search-results rank of a page.
So, if there is a plethora of unique content (watered down or not) with actual value to the reader, I think it will certainly have some value. "Reader's Digest" pretty much exists on this principle, in the sense that it provides the same content in an easier, summarised form.
I hope this helps.
Denis
www.naxtech.com - web development and online marketing
SPECIAL OFFER: 30% discount for all projects signed between 15-30 April 2009
E-Business Consultant at Dan Barker
14 April 2009 17:55pm
Hi Felicity, how are you?
I'm going to totally ignore all benefits other than SEO in this answer. There are a ton of benefits of extra content other than SEO, but I guess (as a writer) you're bought into most of that!
It's a bit of both. The ultra-simplified version is, you will rank first for a particular search term if:
1) Your page contains the unique content for that term, and
2) Your page has more/better inbound links than any competing page that also contains it.
(Note they both say 'page' & not 'site'.)
What that means is: If you don't have the unique content, you can't rank for it.
An Extreme Example
An example of a place where just the existence of the unique content counts way more than the page's potential to generate inbound links would be a big news site.
The editor puts in a call to the seo/content team: "Somali Pirates are big this week guys & I'm not seeing any Somali traffic hitting the sites yet. We need to own this in Google. What can we do?"
They know that they can't drive enough external links in a short enough time ('today') to get them there. And, as the story will be gone in 2 months, there's no point in trying to grab a bunch of 'somali pirates' inbound links to own the phrase long-term. Instead they do something like this: publish a dedicated 'Somali pirates' page and link to it prominently from the homepage and other strong pages, pushing the site's existing pagerank at the new page.
Of course that's just an artificial version of what often happens naturally: Category pages are linked from homepages, article pages are linked from categories, and the pagerank all flows naturally.
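That internal flow can be sketched with a toy power-iteration model of the classic PageRank idea (purely an illustration: the page names and damping factor here are assumptions, and Google's live algorithm is far more complex than this 1998-style model):

```python
# Toy PageRank by power iteration. Rank "flows" along links:
# each page splits its rank among the pages it links to.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page gets a small baseline, plus shares from its in-links
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical site: the new article is linked from both the
# homepage and the category page, so internal rank concentrates on it.
site = {
    "home": ["category", "somali-pirates"],
    "category": ["somali-pirates", "old-article"],
    "somali-pirates": ["home"],
    "old-article": ["home"],
}
ranks = pagerank(site)
```

Running this, the heavily internally-linked "somali-pirates" page ends up with more rank than the buried "old-article" page, which is the effect the editor's team is after.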
A nice way to think about it is a shopping centre with a helpful guide at the entrance. He cares about 2 things related to your shop: 1) The number & importance of people who say "hey, Felicity's shop's really great" - this is your inbound links. 2) The list of products you sell that he knows about - this is your unique content.
Hope that helps - let me know if so.
14 April 2009 21:40pm
Hi Denis, hello Dan,
Thank you so much for such carefully considered and well written advice. I love that this sector contains so many people prepared to take the time to advise others.
Dan, thank you in particular for the examples, lots to think about.
se at http://www.dupecop.net
07 May 2009 06:59am
Duplicate content filters are built into the search engines. Search engines rely on a standard tool, the crawler, which helps filter information so that results are relevant to users. There are instances where spammers use tricks so that the crawler cannot determine the key phrase. So we writers need to protect the articles we produce: writing good-quality articles is hard, so we have to protect them from spammers. There are several ways to avoid duplicate content; the first is by creating spun articles.