Though it’s commonly referred to as the “Google Panda Penalty”, I think it’d be more appropriate to call it an ‘algorithmic filter’. That said, when such a thing partially or wholly wipes sites out of Google’s SERPs, it’s not hard to understand why people might think of it as some sort of ‘penalty’, perhaps aimed specifically at their sites.
So, what is Google Panda after all? Why are most webmasters so afraid of it? How can you tell if your own site is affected by Panda, and how can it recover? That’s what I’ll be discussing in this post.
What is Google Panda?
According to Wikipedia,
Google Panda is a change to Google’s search results ranking algorithm that was first released in February 2011.
Google’s Panda update is most probably named after Navneet Panda, the Google engineer who wrote the first version of the algorithm. So it isn’t solely named after the black-and-white animal, as many would think.
In its early days, what this update did was filter bad or low-quality sites out of Google’s SERPs, which in turn helped high-quality, unique, UX-focused sites rank higher. It hit many thin, low-quality websites that relied entirely on search traffic to generate revenue.
What Triggers Panda?
Today, however, Google Panda affects websites for a bunch of reasons beyond hosting low-quality or thin content. These include (but aren’t limited to):
- Security-related issues / vulnerabilities.
- Slow page-loading times.
- Bad user experience in general.
- Content duplication / re-publication / syndication.
- Complex, hard-to-navigate site structure.
- High pogo-sticking rate.
- A high ads-to-content ratio.
Any one of the reasons above, not just low-quality content, may cause the Panda algorithm to filter your site out of the first few Google SERPs.
How to Know if Your Site’s Been Affected by Panda?
There are a few simple checks you can run if you wish to find out whether your site is free from the wrath of an angry Panda.
Take a single content piece from your site that’s relatively uncommon across the web and perform a Google search for keywords related to it. If irrelevant, or less relevant, results appear before your site’s result, chances are your site is being affected by Panda.
A more foolproof way of doing this is to search for the first three or four words of your content’s title. For example, if the title of a blog post on your site is “Google Panda is Definitely Named After Navneet Panda”, you could search for “Google Panda is definitely named”. If that doesn’t return your blog post at the top of the SERP, you have something to worry about, because that exact opening phrase is extremely unlikely to appear in the title of any other content piece on the web.
The thing with Panda is that, unlike Penguin, it usually doesn’t completely remove your site from Google’s index. While Penguin could keep you from seeing your own post in Google at all, Panda will still show it to you, albeit not as one of the first few results.
How to Recover from a Google Panda Penalty
As we already discussed, Panda isn’t a penalty but an algorithmic filter. I’m only using the word ‘penalty’ here because that’s how most webmasters know it.
If your site’s being negatively impacted by Panda, it’s going to be easier to recover than from something like Penguin, because how quickly and how well your site recovers depends entirely on the site itself, not on external sites linking to it.
First, let’s start with duplicate content. It can include content copied from another source on the web, or numerous pages of your site sharing the same title, content, or meta description.
1. Duplicate Content
Why would you post duplicate content on your site in the first place? Even if you have to copy-paste that ESPN article for your own readers, make sure you noindex it so it doesn’t compete with the original ESPN article on Google. As long as you’re telling Google not to index something, it doesn’t matter whether it’s duplicate, low-quality, or anything else; Google simply won’t care.
Things get nasty when you stuff keywords into low-quality articles and try to rank them on Google; that’s exactly what Panda was written to fight.
If your site’s using WordPress, you can use SEO plugins like WordPress SEO by Yoast to set a noindex attribute on individual posts. If a lot of duplicate, low-quality content from your site is already in Google’s index, and noindexing it is taking too long to get it removed, here’s a comprehensive tutorial you can follow to quickly remove pages of your site from Google’s index.
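If you’d rather not use a plugin, the noindex directive is just a robots meta tag in the page’s `<head>`. A minimal sketch of the standard tag:

```html
<!-- Tells search engines not to index this page,
     while still following the links on it -->
<meta name="robots" content="noindex, follow">
```

Plugins like Yoast output this tag for you on the posts you mark as noindexed.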
You can also take advantage of robots.txt to stop certain types of pages from being crawled by search engines in the first place. For example, you can keep WordPress search result pages from being crawled by adding a Disallow rule to your robots.txt file.
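A minimal sketch, assuming your WordPress install uses the default `?s=` search query parameter (adjust the path if your site uses pretty search URLs such as `/search/`):

```
User-agent: *
# Block crawling of WordPress search result pages
Disallow: /?s=
Disallow: /search/
```

Note that robots.txt only stops crawling; for pages that are already indexed, the noindex route is the more reliable way to get them out of the SERPs.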
2. Slow Site Speed
A slow-loading site isn’t only bad for visitors; it’s also bad for you as a webmaster, since it tends to increase the time required for maintenance work on the site. Though site speed on its own doesn’t matter a lot for ranking, when a site is so slow that it pains visitors, Google Panda may come along and push its results into the later SERPs.
So, keeping all these benefits in mind, it’s worth trying to reduce your site’s page loading times. Here’s a tutorial I wrote some time back with 17 ways you can make your WordPress sites faster.
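One common quick win, assuming your host runs Apache with mod_expires and mod_deflate enabled (check with your host first), is turning on browser caching and gzip compression via your site’s .htaccess file. A sketch:

```apache
# Cache static assets in visitors' browsers for a month (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Compress text-based responses before sending them (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Caching plugins for WordPress typically write rules like these for you, so this is mainly useful if you prefer to manage the file yourself.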
3. Security Issues
Google, quite expectedly, doesn’t want its users visiting sites that may contain malware or otherwise be insecure. So it either de-indexes the affected site or shows a warning beside it in the SERPs. Either way, this will hurt the number of visitors you get through search engines until you fix the security issues and Google no longer finds your site insecure.
The best way to check for security issues is to head over to Google’s own Webmaster Tools; if your site has any, GWT will display a warning message letting you know about them. If it doesn’t, fine, your site’s alright from a security perspective. But if it does, I’d recommend asking your web host to scan the whole site directory, and if that doesn’t fix the issue, you may be better off hiring a website security service like Sucuri to clean your site up.
4. UX-related Issues
Now, these include high pogo-sticking rates, too many above-the-fold ads, and poor user experience in general. To tackle them, you simply have to improve your site’s user experience. You might try a sleeker theme, or have one designed specifically for your site by a professional.
A high pogo-sticking rate can also be caused by low-quality content. So, in addition to improving the overall user experience of your site, you should also be careful about the quality of the content you’re putting up.
Google Panda isn’t really something you should be afraid of, unless you run scraper site networks or so-called content farms that exist only to make AdSense revenue. If you’re a webmaster who’s running unique, high-quality sites and you think one of them may be affected by Panda, don’t be afraid to use this guide to find out whether it really is. Even if it is, if you take the right steps based on the points discussed above, it shouldn’t be all that hard for your site to regain its old glory.
So, what other ways do you recommend for identifying and recovering from the negative impact of Google Panda?