What is Google aiming to achieve with Panda 4.1?

Google’s Panda updates always aim to improve the quality of search results.

The first Panda update was known as the content farm update because it was mainly directed against thin content published on large-scale platforms such as eHow or EzineArticles.com.

Panda 4.0 was a major rework of how Panda is implemented in the algorithm, so Panda 4.1 is simply a continuation of what Google did with 4.0.

Where are you seeing the biggest changes so far?

You can tell that a Panda update has rolled out just from our list of winners and losers. There is always a pattern among the sites on these lists.

Panda updates are query-based. That means the impact is not felt across all queries, only a specific set of them, and that’s why you can find a pattern. The changes we saw this time are again directed against thin content and aggregator sites.

Sites like FindTheBest, Pearltrees or Socialcomments.org aggregate content and present it in a nice, easy-to-consume way.

But they are aggregators, which means no unique content, and the content is assembled much like search engine results. Google doesn’t want search-within-search to rank highly.

Based on what you have seen so far, has Panda 4.1 achieved what Google wanted?

This is a question that only Google can answer. Google is never finished.

Every new iteration of Panda or other updates increases quality. But with all the redundancy on the web, Google still has a lot to do to filter out the noise and improve the relevance of its results step by step.

What kinds of signals does Google look for to determine low-quality, thin content sites and pages?

There are two different kinds of signals. Quantitative signals are based on content and site structure. Qualitative signals are based on user behaviour.

For the quantitative signals, Google’s mission is to find sites with unique, rich content that adds value for the user. With all the data Google has, the next step is to take the user signals and weigh each result against a group of comparable results.

Results with worse-than-average user behaviour point to sites with a bad user experience. Google uses both kinds of signal to determine which results are relevant to the user’s query.

When a site triggers too many ‘bad’ signals, the whole site can drop out of the rankings. Panda is not based on just a few keywords on a website; it always hits a large part of the site, and that’s why Panda is a serious issue.
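To make the idea of weighing user signals against a group of comparable results more concrete, here is a minimal sketch in Python. The metric names (click-through rate, dwell time), the equal weighting and the cohort data are assumptions for illustration only; this is not Google’s actual scoring.

```python
# Illustrative only: a toy comparison of user-behaviour signals against a
# cohort average. Metric names and weights are assumptions, not Google's formula.
from dataclasses import dataclass
from statistics import mean

@dataclass
class PageSignals:
    url: str
    click_through_rate: float   # share of impressions that led to a click
    dwell_time_seconds: float   # time spent before returning to the results

def behaviour_score(page: PageSignals, cohort: list[PageSignals]) -> float:
    """Return a score above 1.0 if the page beats the cohort average."""
    avg_ctr = mean(p.click_through_rate for p in cohort)
    avg_dwell = mean(p.dwell_time_seconds for p in cohort)
    # Simple ratio-based comparison; equal weighting is an arbitrary choice.
    return 0.5 * (page.click_through_rate / avg_ctr) + \
           0.5 * (page.dwell_time_seconds / avg_dwell)

cohort = [
    PageSignals("https://example-a.test/result", 0.12, 95.0),
    PageSignals("https://example-b.test/result", 0.08, 40.0),
    PageSignals("https://example-c.test/result", 0.10, 70.0),
]
for page in cohort:
    print(page.url, round(behaviour_score(page, cohort), 2))
```

In this toy model, a page scoring below 1.0 behaves worse than its cohort average, which is the kind of comparative judgement described above.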

What is your advice for sites hit by Panda?

Housekeeping! Many sites simply have too much thin content. They have to add value to those pages, find redundant pages and merge them, or delete them outright if the content is old and outdated.

But importantly, if you don’t do anything it will get worse. Pages that keep their thin content suffer the so-called slow-death phenomenon.

They gradually lose visibility over time until nothing is left. That’s why companies should build a Panda-proof site before they are affected.
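As a concrete starting point for that housekeeping, a small audit script can flag candidate pages. The sketch below uses an assumed word-count threshold and a simple text-similarity cut-off; the page data, threshold and cut-off are purely illustrative.

```python
# A minimal thin-content audit sketch. The word-count threshold and the
# similarity cut-off are arbitrary assumptions; a real audit would also look
# at crawl data, internal links and traffic.
from difflib import SequenceMatcher

pages = {
    "/guide/how-to-fix-panda": "A long, detailed guide with original research ...",
    "/tag/panda": "Panda update news.",
    "/tag/panda-update": "Panda update news and tips.",
}

THIN_WORD_COUNT = 300   # assumed threshold for "thin" content
DUPLICATE_RATIO = 0.8   # assumed similarity cut-off for near-duplicates

def is_thin(text: str) -> bool:
    return len(text.split()) < THIN_WORD_COUNT

def near_duplicates(a: str, b: str) -> bool:
    return SequenceMatcher(None, a, b).ratio() >= DUPLICATE_RATIO

urls = list(pages)
for url in urls:
    if is_thin(pages[url]):
        print(f"thin content, improve or remove: {url}")
for i, first in enumerate(urls):
    for second in urls[i + 1:]:
        if near_duplicates(pages[first], pages[second]):
            print(f"candidates to merge: {first} <-> {second}")
```

Pages flagged as thin should be expanded or removed, and near-duplicate pairs are candidates to merge, as described above.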

What do you think Google will be looking to deal with in the next update?

Google always wants to improve and to deliver the right result at the right time in the right context.

There will be many more updates in which content relevance is the key.