Sifting through the Google Panda-Monium

Have you read any of the industry scuttlebutt about Google Panda? What, exactly, is Panda?

Google Panda is a significant update to the Google search algorithm that was launched late last month. While Google is constantly adjusting its ranking algorithms, most changes are not immediately or broadly evident. With Panda, however, we have a whole different story.

From the official Google Blog on 2/24/11:

    “…in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

Okay, fine. Sounds like typical Google-speak, whether you are a fan or a foe of the search behemoth. But what does this all mean to you? We’ve culled through the editorials to identify some takeaways.

Many of the sites that were negatively affected by Panda are “spammy,” untrustworthy sites with a high ratio of ads to above-the-fold content (i.e., word count). And while Google claims data collected from Chrome’s site-blocker extension does not factor into Panda, it stated the results were used comparatively to confirm the efficacy of the algorithm.

Quality Content
Google puts sites like Wikipedia and the New York Times on one end of the quality spectrum. On the other end are low-quality “scraper” sites and “content farms” like EzineArticles and Squidoo. So the answer is simple, and nothing new: keep updating your site with new, unique, relevant content for best results. Moreover, buying backlinks on spam sites isn’t just bad practice; it may get you delisted by Google altogether.

Less Frequent Indexing
If Google determines that your site is less relevant or useful, it will probably crawl it less frequently. You can gather data on when Google last crawled your site by referring to your server logs or Webmaster Tools. Understanding which pages Googlebot is crawling – and how frequently – can help you manage expectations for rank improvement. If only 50 of your pages are indexed, and they are crawled only every 30 days, you won’t see ranking changes any more frequently than that.
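If you’d rather not eyeball raw logs, a few lines of scripting can tally Googlebot visits by day. Here is a minimal sketch in Python, assuming a standard combined-format access log (the sample lines and function name are illustrative, not from any particular server):

```python
import re
from collections import Counter

# Sample combined-format access log lines (illustrative only).
LOG_LINES = [
    '66.249.66.1 - - [24/Feb/2011:10:15:32 +0000] "GET /about HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.7 - - [24/Feb/2011:10:16:01 +0000] "GET /about HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [25/Feb/2011:02:03:10 +0000] "GET /news HTTP/1.1" 200 2048 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Pulls the "24/Feb/2011" portion out of the bracketed timestamp.
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_crawls_per_day(lines):
    """Count log lines whose user-agent string mentions Googlebot, grouped by date."""
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

print(dict(googlebot_crawls_per_day(LOG_LINES)))
# {'24/Feb/2011': 1, '25/Feb/2011': 1}
```

In practice you would read the lines from your server’s access log file rather than a list, but the tallying logic is the same; a long gap between dates is a hint that Googlebot has deprioritized your site.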

Cloaking is ALWAYS Black Hat
Industry wisdom may have previously condoned “white hat” cloaking, but Google doesn’t buy it. Matt Cutts of Google stated clearly, “White hat cloaking is a contradiction in terms at Google.” This means that any site configured to display different content to visitors than what is available to the bots is in danger of sanctions by Google.
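To make the definition concrete, here is a minimal illustration (in Python, with a hypothetical page handler) of the pattern Google penalizes – branching on the crawler’s user agent so that bots and humans see different content:

```python
def render_page(user_agent: str) -> str:
    """Return different HTML depending on who is asking -- i.e., cloaking.

    Illustration only: this is the kind of logic Google sanctions,
    regardless of whether the intent feels "white hat".
    """
    if "Googlebot" in user_agent:
        # Clean, keyword-rich copy served only to the crawler.
        return "<html><body><p>Original in-depth analysis and research.</p></body></html>"
    # Ad-heavy page served to human visitors.
    return "<html><body><div class='ads'>Sponsored links...</div></body></html>"

bot_view = render_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
human_view = render_page("Mozilla/5.0 (Windows NT 6.1)")
print(bot_view != human_view)  # True -- the two audiences see different pages
```

Per Cutts’s statement, any such user-agent (or IP-based) branch that changes the content itself puts a site at risk, whatever the stated intent.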

More Changes to Come
Google reports it will continue making changes to the algorithm in the coming months, including rolling it out internationally. We’ll keep an eye on the subject and bring you the news as it evolves.

About Pluff

Pluff provides Windows hosting for advanced applications built on the DNN, Sitecore, and Kentico CMS platforms.
