Understanding Panda

The change to Google's ranking algorithm, named Panda after one of its engineers, impacted 11.8% of US queries, targeting sites whose content was poor, unoriginal, or not very useful.

The Panda algorithm and the quality of a site

"We want to encourage a healthy ecosystem..." Google said.

The criticism the firm had suffered - remember the April 1st joke about the "AdSense" yacht of Demand Media's CEO - had hurt the search engine's image, and it had to react.

Panda was initially a program run manually by Google from time to time to assess the "quality" of sites; it was then integrated into the organic ranking algorithm in January 2012.
It calculates a modification factor for a site, used to adjust the ranking that its pages previously obtained from the other criteria. No other criterion of the algorithm alters this factor.
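
As a rough illustration of this mechanism (the actual formula is not public), here is a minimal sketch in which a hypothetical site-wide factor rescales the scores that pages obtained from the other criteria:

    # Minimal sketch: a site-wide modification factor rescales the scores
    # that pages already obtained from the other ranking criteria.
    # All names and values are invented for the illustration.

    page_scores = {                      # scores from the rest of the algorithm
        "example.com/guide": 0.82,
        "example.com/tag/misc": 0.41,
    }

    site_modification_factor = 0.7       # hypothetical Panda factor for the whole site

    adjusted_scores = {url: score * site_modification_factor
                       for url, score in page_scores.items()}
    print(adjusted_scores)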

How the Panda algorithm changes the results

In theory, Panda distinguishes a quality site from a site of no interest as follows:

Google's algorithm looks for authoritative sites that consistently offer new information and original insights, as opposed to those churning out five hundred words on a topic they have no special background in.

Another more recent quote:

Demoting low-quality sites that did not provide useful original content or otherwise add much value.

This sentence is circular: it defines a low-quality site by its lack of quality. Webmasters, however, are looking for more precise criteria. Here is how Google assesses the "quality" of a site, based on the information given in patent 8,682,892:

Panda was designed as a separate program because it requires vast resources to partition the Web into groups of resources and to compare them.

With Panda, Google wanted to radically change the very role of the search engine: it no longer wants its results to contribute to the promotion and success of a site. A site must now build its audience solely from the links it receives, and if it achieves some success, the engine may then place it at the top, depending on the other ranking factors.

The discourse about quality was misleading: it is really a matter of popularity, since popular sites always get lots of links whatever they publish, often information taken from other sites. Reading the patent also shows that Google pays little attention to originality: fully copied content can be ranked above the original if it attracts more independent links.
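
To make the notion of "independent links" concrete, here is a minimal sketch that approximates independence by counting unique referring domains rather than raw backlinks (the URLs are invented; the patent does not describe any such code):

    # Approximate "independent links" as links coming from distinct sites,
    # i.e. count unique referring domains rather than raw backlinks.
    # The URLs below are invented for the example.
    from urllib.parse import urlparse

    backlinks = [
        "https://blog-a.example/post-1",
        "https://blog-a.example/post-2",
        "https://news-b.example/article",
        "https://forum-c.example/thread/42",
    ]

    referring_domains = {urlparse(url).netloc for url in backlinks}
    print(len(referring_domains))   # 3 independent sources, not 4 raw links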

How to modify a site for Panda

What to do when hit by the Panda update?

From Google:

"Low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content."

But experts agree that it is not possible to lift the penalty without changing the content of existing pages and adding new content. Merging two pages of banal content only produces one bigger page of banal content; it solves nothing.
The webmaster's effort must be focused on obtaining independent backlinks.

  1. In theory, a site maximizes the Panda formula if it has many backlinks and no content; a pure service could match this scheme. But Panda is only a factor that amplifies an initial score, and that score depends on the content.
  2. Removing all the pages that have no backlinks is an effective way to improve this ratio and thus to recover; alternatively, take them out of the index with a noindex meta tag (see the sketch after this list).
  3. For pages whose interest to the user the engine cannot recognize, you may enrich their content. But not if they already have a lot of backlinks: changing the content could trigger other penalties (unless the change brings them new backlinks).
  4. To get genuine backlinks, make sure your content offers something useful and unique (search first for similar content). Always ask what your page offers that others do not.
  5. Personalize the content and use your own words. Bloggers especially: remember your essays at school, where the teacher did not ask you to copy the subject or someone else's answer, but to give your own ideas. Present different viewpoints to appear authoritative rather than biased.
  6. Take care of the user experience: give visitors an incentive to view more pages or to return to the site.
  7. For pages that have no chance of getting backlinks or of ranking well in the SERPs, make them dynamic, and therefore invisible to the engines, while still answering visitors' questions. This is what we do here with the dictionary (button at the top right), through the use of Ajax.
  8. Watch the exit rate in Analytics or another statistics tool. Pages with a high exit rate penalize the site; they can be deleted or made dynamic if they have no backlinks (see the sketch after this list).
  9. Again, do not modify pages with a lot of backlinks.
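
As a rough illustration of points 2 and 8, here is a minimal sketch that flags candidate pages, assuming you can export per-page backlink counts and exit rates as CSV files (the file names, column names and threshold are hypothetical):

    # Triage sketch for points 2 and 8: flag pages with no backlinks or a high
    # exit rate as candidates for removal, noindex or dynamic loading.
    # File names, column names and the threshold are assumptions.
    import csv

    EXIT_RATE_THRESHOLD = 0.80   # arbitrary cutoff for a "high" exit rate

    def load_csv(path):
        with open(path, newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))

    backlinks = {r["url"]: int(r["backlinks"]) for r in load_csv("backlinks.csv")}
    exit_rates = {r["url"]: float(r["exit_rate"]) for r in load_csv("exit_rates.csv")}

    for url, links in backlinks.items():
        exit_rate = exit_rates.get(url, 0.0)
        if links == 0 and exit_rate >= EXIT_RATE_THRESHOLD:
            print(f"{url}: no backlinks, high exit rate -> delete, noindex or make dynamic")
        elif links == 0:
            print(f"{url}: no backlinks -> candidate for noindex or enrichment")
        else:
            print(f"{url}: {links} backlinks -> do not modify")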

Be aware that changing the existing content alone will not suffice to undo the effects of the penalty, because it will not attract new backlinks; above all, it is new, unique content that will do so.
This requires a lot of work, but you can console yourself by thinking of the content farms that have millions of pages to modify...

Conclusion

The most important fact making the results incomprehensible to webmasters, and one officially confirmed by Google, is that if part of a site is penalized, the whole site is penalized. Pages of very good quality can thus be ranked in the SERPs below lower-quality pages from other sites!
It is even harder to accept when we know that, although Google presented the process as a means of selecting quality pages, its main effect is to promote the biggest sites and increase their audience even further.

Dates of Panda updates

The first two are official, the following are estimated:
