Google Penalties: How to Find, Fix, and Avoid (An Expert Guide)

Nick Churick
Nick is our very own "Communications Manager" at Ahrefs (whatever that means) and coincidentally he's also a pretty skilled writer. So there you have it - he's now a regular contributor to our blog.

    Google penalties are every webmaster’s worst nightmare. But even if you get one, all is not lost.

    In this post, you will learn:

    • about different Google penalties and filters;
    • how to detect them;
    • how to recover from them;
    • how to make sure your website doesn’t get hit again.

Thankfully, we have never had to deal with penalties on the Ahrefs blog, so we invited Marie Haynes to help us out.

For those unfamiliar with Marie, she’s an industry veteran with 14+ years’ experience who happens to be “completely obsessed with understanding Google’s algorithm changes.”

    She has helped numerous businesses recover from “Fred,” Penguin, Panda, and Manual Penalties.

    one example from Marie’s HUGE portfolio of Google penalty recoveries.

Forbes, Inc., Entrepreneur, and several other authoritative publications have all quoted her advice. Search Engine Roundtable (and by proxy, the entire search community) honored her in November 2017.

Bottom line: she knows her stuff when it comes to Google penalties. 😉

I’ve personally learned a lot from Marie’s blog posts and talks. So for me, there is no one more qualified than her to advise on this topic.

    Let’s rock!

    Google Penalties and Filters: What’s the difference?

    There’s a BIG difference between Google Penalties and Filters.

But I’ve noticed that many people confuse these terms and use them interchangeably.

    Here’s how Marie explains these terms:

    When I speak of “penalties” I am usually referring to a manual action from Google. A manual action can be given to a site when the site has been manually reviewed by a Google employee. If you have a manual action, you’ll be able to see it if you look in Google Search Console (the old version, not beta) under Search Traffic –> Manual Actions.

    It will look something like this:

    If you don’t have any message like that in the manual actions viewer, then you don’t have a manual penalty.

There you have it: Google is not only about fantastic algorithms and machine learning. They also have a vast army of quality evaluators who assess websites, and those reviews can result in penalties.

So when Google’s algorithms signal that there’s something “suspicious” about a website, or when people report that website to Google, it can be manually reviewed and relevant actions taken.

    Users can report websites to Google when they see pages with spam, paid links or malware in search results.

    However, even if you don’t have a manual penalty, you still could be affected by an algorithmic filter.

    Here’s what Marie had to say:

    A filter is a part of the main algorithm that can cause your site to be algorithmically suppressed. For example, the Panda algorithm can act like a filter. If your site is deemed lower quality by Panda, then the filter can act like an anchor that holds your site down. This anchor can make it really hard for you to rank well.

    In the past, Panda and Penguin were algorithmic filters that would run periodically. Google would update the algorithm and then, if Panda or Penguin thought your site had issues, they would put something like an invisible flag on your site that would cause it to be suppressed. Over the last few years though, much has changed in how Panda and Penguin run. They are both baked into the algorithm and run continuously now. Also, Penguin no longer suppresses sites. As such, it gets difficult to talk about either of these as a distinct filter.

    There are likely also additional types of filters now that aren’t Panda or Penguin. For example, it’s possible that the general core quality algorithm could consider parts or all of your site as lower quality and cause a ranking suppression. A good example of this is for sites that are lacking in E-A-T (Expertise, Authoritativeness and Trust). If you are, say, a medical site that is offering medical advice, but you don’t have real-life E-A-T, then Google may put a filter on the site that causes a demotion.

It’s worth pointing out that the E-A-T acronym was not created by Marie herself, but rather by Google in their Search Quality Evaluator Guidelines (see page 32).

Expertise, Authoritativeness, and Trustworthiness (E-A-T) is what shapes the quality of a web page when reviewed by Search Quality evaluators.

    Sidenote.
I believe that the evaluators’ activity is further processed and used as part of Google’s machine learning, though this hasn’t been proven.

But how do you know if you’re suffering from an algorithmic filter?

    Here’s Marie’s take:

    If you are negatively affected by an algorithmic filter, there is no notification from Google on this. The best way to determine this is by looking at Google organic traffic and seeing if you have a drop that coincides with a known or suspected Google algorithm update. But even then, it’s not guaranteed that this update was the cause of your drop. Algorithmic filters are really hard to diagnose.

    Editor’s note

    With Penguin 4.0 in full swing, identifying algorithmic penalties may be a little harder than it was in the past.

    But the “Panguin” tool should probably be your first port of call.

    This nifty tool connects to Google Analytics and overlays known algorithmic updates over a graph of your organic search traffic.

    an example of the Panguin tool in action.

    Each line represents a known algorithm update, which makes identifying potential issues much easier.

    If you don’t use Google Analytics, you can always perform a manual check using a combination of the organic traffic graph in Site Explorer and Marie’s list of algorithm updates.
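If you’d rather script that manual check, here’s a rough Python sketch of the same idea: overlay a hand-maintained list of known (or suspected) update dates on a daily organic traffic export and flag any update where traffic in the following week falls well below the week before. The file name, column names, and update dates below are placeholders, not real data.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical list of known/suspected algorithm update dates.
# Maintain your own, e.g. from Marie's list of algorithm updates.
UPDATE_DATES = ["2018-03-09", "2018-08-01"]

def load_daily_traffic(path):
    """Read a CSV export with 'date' (YYYY-MM-DD) and 'sessions' columns (assumed names)."""
    traffic = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            day = datetime.strptime(row["date"], "%Y-%m-%d").date()
            traffic[day] = int(row["sessions"])
    return traffic

def drops_near_updates(traffic, window_days=7, threshold=0.8):
    """Flag updates where average traffic in the week after is below 80% of the week before."""
    flagged = []
    for date_str in UPDATE_DATES:
        update = datetime.strptime(date_str, "%Y-%m-%d").date()
        before = [traffic[update - timedelta(days=d)] for d in range(1, window_days + 1)
                  if update - timedelta(days=d) in traffic]
        after = [traffic[update + timedelta(days=d)] for d in range(1, window_days + 1)
                 if update + timedelta(days=d) in traffic]
        if before and after and sum(after) / len(after) < threshold * (sum(before) / len(before)):
            flagged.append(date_str)
    return flagged

if __name__ == "__main__":
    traffic = load_daily_traffic("organic_sessions.csv")  # hypothetical export file
    print("Drops coinciding with known updates:", drops_near_updates(traffic))
```

A flag here won’t prove an update caused the drop (as Marie says, algorithmic filters are really hard to diagnose), but it narrows down where to look.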

Joshua Hardwick
Head of Content

The more I learned about Google’s penalties, the more I wondered whether the number of penalized sites had gone up or down over the past few years.

    In my opinion, the number of manual actions that Google gives out has reduced greatly. This is because Google is getting much better at dealing with these things algorithmically.

Conversely though, the algorithm is getting much better at identifying quality content, so it can often look like a site is penalized. For example, in the past, you could take a mediocre site and if you built enough links to it, it would rank really well. But now, if you do that, Google is smart enough to recognize whether the links are truly earned or self-made and whether the content really is the type of content that people want to recommend.

    More and more, the sites that are doing well are absolutely amazing sites that truly are the best of their kind. Sure, there are still some people who are able to spam their way to the top, but it is getting much harder to do that!

    What Penalties (Manual Actions) Can a Website Get?

    There are two types of actions that can be displayed on the Manual Actions page:

1. Sitewide matches, which affect an entire site;
2. Partial matches, which apply to an individual URL or a section of a website.

    Every Manual Action notification is accompanied by “Reason” and “Effects” information.

    The list of common manual actions includes:

    • Hacked site;
    • User-generated spam;
    • Spammy freehosts;
    • Spammy structured markup;
    • Unnatural links to your site;
    • Thin content with little or no added value;
    • Cloaking and/or sneaky redirects;
    • Unnatural links from your site;
    • Pure spam;
    • Cloaked images;
    • Hidden text and/or keyword stuffing

    You can read more about the Manual Actions report in this Search Console Help Article.

    Can a website get a Manual Penalty without any notification in GSC?

    Most likely, the answer to this is no. If a site has a manual action, you really should always have that reflected in the manual actions section of GSC. Take note though, that you may not always see the penalty in the “messages” section, especially if you are a newly added owner to the GSC profile. You will still see the manual action in the manual actions section though.

    I have had a couple of really unusual cases where I felt that there was some type of manual suppression on a site that was not reflected in the manual actions section of GSC. I want to say though that I think that this is really rare.

    In one situation, the site could not rank for its brand name even though it was a recognizable brand. That’s usually a sign of either a significant manual action or serious technical issues. I wrote to a contact of mine at Google and within 24 hours they were on page 3 after being on page 11 for brand terms for months. That case is complicated though as we did also find technical issues to fix. Within two weeks of fixing those, they popped up to #1. So, it’s hard to say whether there was a hidden penalty or whether the technical issues were to blame. I do think it was a heck of a coincidence though that they jumped from page 11 to page 3 after my inquiry.

    In another case, I contacted a Google employee about a site that I really felt was being suppressed unfairly. The next day it was as if the suppression was lifted. Can you tell on the image below when that started to happen?

    captured from Ahrefs Site Explorer

    These are really rare though. If you feel like your site is being suppressed by Google, it is extremely unlikely that there is a hidden penalty that you are unaware of. In almost every case there are other quality issues.

    If you were hit by a quality update in 2017 and beyond, I’d highly recommend looking at Google’s Quality Raters’ Guidelines for clues as to what to fix to improve your quality.

    What causes most penalties today?

Considering the number of various Manual Actions Google can take, I couldn’t help but wonder which are the most common.

    From Marie’s experience, more often than not these are attempts to manipulate Google search results.

    Most penalties these days come from over-aggressive attempts at optimization. If you get a manual penalty for unnatural links, it is usually because you are buying links or creating link schemes on a large scale.

    Sidenote.
You can learn about all the link schemes that violate Google’s Webmaster Guidelines in this Search Console Help article.

    Another type of penalty that can be given is a thin content penalty. These are usually given when a site has a huge number of thin doorway pages that are only there for SEO reasons. Again though, most of these are simply caught as lower quality pages by algorithms now, so I am not seeing as many thin content penalties as I did in the past.

    I would say that almost all Google penalties now are given because the site owner was trying too hard to manipulate Google. Five or six years ago I did see a lot of penalties that came as a result of good, honest business owners hiring poor SEO companies who built unnatural links. But, now, most of that type of link is just ignored by Penguin. As such, if you get a link penalty, you usually know that you deserved it.

    The key takeaway? Stop doing shady stuff, folks! 🙂

    Can negative SEO attacks lead to Manual Actions?

    What if webmasters have nothing to do with the unnatural links to their website?

    What if their competitors built those spam links as a negative SEO attack to demote a competing site?

    Marie, take it away:

    In my opinion, it is rare that negative SEO attempts will lead to a manual action. If a site does get a manual action after a wave of negative SEO links are pointed at them, it almost always turns out that the site had also been involved in a lot of their own link building as well.

    This is always a controversial subject. I have reviewed hundreds of cases where the site owner felt they had been a victim of negative SEO. In almost all of these cases, I did not feel that negative SEO attempts had hurt them.

    Every time I say this, a blackhat will pop up and swear that they have taken sites down with negative SEO. It’s really hard to prove though.

    I would say that for most sites, if you are being attacked by an onslaught of spammy links, you can just ignore them. However, I would still disavow links like this if the following is true:

    • If you have your own history of self-made links for SEO purposes in the past.
    • If you are in an incredibly competitive vertical like casinos, porn or pharma. I believe there are tougher algorithms in place in these niches that can make negative SEO a little bit more effective.
    • If you see a drop in traffic that coincides with the onslaught of links and there is no other explanation for the drop.

    So if you do some black-hat SEO, negative SEO from your black-hat competition might bring unnecessary attention to your website. Isn’t that another signal to quit black hat?

    TL;DR: it’s unlikely. 😉

    Are there Lifetime Penalties?

I really wondered whether each and every penalty could be lifted.

    Can Google hold a grudge, Marie?

    I have seen cases where a domain name is purchased and the new site is unable to rank because the domain was previously given a pure spam penalty. Those are usually pretty easy to fix with a reconsideration request though.

    Google has said that they do not blacklist domains. If you have had a penalty in the past, and you do the proper work to clear up the penalty, then your site should not be suppressed any more.

    However, in most cases, when you clean up the site, you’re removing things that were artificially propping up your rankings before. If you previously were ranking #1 on the power of paid links, removing those paid links is not going to make you jump up to #1 again. If you previously were ranking in multiple cities because you created thin doorway pages, you are not likely to be able to recover those rankings following a thin content penalty.

    As such, sometimes it looks like a domain is forever suppressed by Google, but really, what has happened is that Google has closed the tricks and loopholes that allowed it to rank well before.

    I did some research and came across John Mueller’s AMA session on Reddit where he took time to answer tons of interesting questions. Here’s what he says:

    There’s no “reset button” for a domain, we don’t even have that internally, so a manual review wouldn’t change anything there. If there’s a lot of bad history associated with that, you either have to live with it, clean it up as much as possible, or move to a different domain. I realize that can be a hassle, but it’s the same with any kind of business, cleaning up a bad name/reputation can be a lot of work, and it’s hard to say ahead of time if it’ll be worth it in the end.
    John Mueller
    John Mueller, Webmaster Trends Analyst, Google

    Apparently, there are some extreme cases where it’s easier to register a new domain than clean up the mess associated with the old one.

How to Identify Whether Your Site Is Filtered or Penalized

If you need to find out whether your website has been filtered or penalized, Marie recommends looking at just a few core factors:

    I’d recommend the following:

    • Check GSC –> Search Traffic –> Manual actions to make sure that you don’t have a manual penalty.
    • Make sure your Google Analytics tracking code is working properly.
    • If you are seeing drops in rank checkers, check rankings manually on an incognito browser to see if the rankings truly are down.
    • Make sure you’re looking at Google organic traffic and not all traffic. I have had several cases where a site has contacted me because of a traffic drop, when in reality all that happened was they stopped running PPC ads.

    Losing organic search positions and, consequently, traffic, does not necessarily indicate that your website was filtered algorithmically.

    You could have simply been outperformed by your competitors in organic search results.

    It is so common that a site may think that they’re being penalized, when really the issue is that a competitor is starting to beat them. And often, it’s really hard to compare your own site to competitors objectively.

    For example, take the March 9, 2018 algorithm update. Google confirmed that this was an update that rewarded high quality sites. It wasn’t a demoting algorithm update. As such, if you saw drops on March 9, it’s not that you were demoted as low quality, but rather, other sites were promoted above you. […]

    For reference, here is more information on the algorithm update Marie is referring to.

    Sorry to interrupt, Marie!…

    […] But even then, you’d want to be critically assessing your own site to figure out how you can make it the best of its kind.

    When we look at sites that have seen drops, we like to look at who is gaining amongst their competitors. Sometimes it is obvious that one particular site is suddenly seeing gains. If that’s the case, then we look at it and say, “Why would I prefer this #1 ranked site over the site that used to be #1?” Usually the answer to that question is obvious.

    If there is no obvious competitor that is emerging as the winner, but rather, everyone seems to be beating you, then this could be a sign that your site (or perhaps your page) is being demoted due to low quality.

    It’s usually not straightforward though whether you have problems. And ultimately the answer is the same in every case: DO EVERYTHING YOU CAN TO IMPROVE UPON QUALITY. It’s rare these days that you can fix a traffic drop by finding and fixing a smoking gun. In most cases, you’re not going to be able to remove a few pages or do a disavow or tweak a few things and see a return to good rankings. In order to recover from a quality hit you need to make extensive improvements.

    How to Analyze Your Backlink Profile for Possible Issues

“Unnatural links” to a website is one of the most common causes of penalties.

    If you see this notification, you must start with your backlink profile analysis.

Pay particular attention to a dominance of irrelevant or over-optimized anchor text and a large number of backlinks from poor-quality websites.

Here’s an oversimplified process from Marie Haynes, who recommends starting with an anchor text profile analysis:

    I love looking at links with Ahrefs. I usually start out by looking at the anchor text profile. Most commonly, you’ll see that most of the links are URL or brand anchored. But, if a site has been heavily involved in manipulative link building, you’ll usually see a lot of keyword anchors:

    Anchors report in Ahrefs Site Explorer

    It’s important to note though, that this is just a generalization. If your site has a lot of keyword anchored links, that can be perfectly ok. But, if you have them because you made those links, then that is a problem.
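If you want a rough number on that rather than just eyeballing the Anchors report, here’s a minimal Python sketch that buckets anchors from an exported anchors report into brand/URL anchors vs. everything else (likely keyword anchors). The file name, column name, and brand terms are assumptions you’d adjust for your own site.

```python
import csv
from collections import Counter

# Hypothetical brand terms for your own site; adjust to taste.
BRAND_TERMS = {"example", "example.com"}

def classify_anchor(anchor, site_domain):
    """Rough classification: brand/URL anchors vs. everything else (likely keyword anchors)."""
    text = anchor.lower().strip()
    if not text:
        return "empty/image"
    if site_domain in text or any(term in text for term in BRAND_TERMS):
        return "brand/URL"
    return "keyword/other"

def anchor_profile(path, site_domain):
    """Return the percentage share of each anchor bucket from an anchors export."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # "Anchor" is an assumed column name; adjust to match your export.
            counts[classify_anchor(row["Anchor"], site_domain)] += 1
    total = sum(counts.values())
    if total == 0:
        return {}
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

if __name__ == "__main__":
    # "anchors_export.csv" is a placeholder file name.
    print(anchor_profile("anchors_export.csv", "example.com"))
```

A profile dominated by keyword anchors isn’t proof of anything on its own; per Marie’s point above, it’s a prompt to ask where those links came from.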

    She then recommends following this with a backlink profile analysis:

Next, what I will usually do is sort the site’s links by looking at one link from each domain and then clicking on “dofollow”:

    (Note: The word “dofollow” is a bit of a misnomer. “Nofollowed” is a thing, but “dofollow” is really just a regular link.)

    I’ll then export this list and upload it to a tool that I created that checks the links against my blacklist. My blacklist is a personal list that I have created over years of auditing hundreds of thousands of links. I have had many requests from people who would like me to add their own disavow decisions to this list. But, I have found that I don’t trust anyone else’s decisions on disavowing. As such, this blacklist contains domains that I have visited where I know I would almost always want to disavow them.

    If you want to check a domain against my blacklist, you can do so here. I do also have a tool that is a paid tool that allows you to upload the full list of linking URLs or domains for a site and see which of these are on my blacklist. I upload the links from Ahrefs Site Explorer into this tool and usually can get a good sense as to how serious the link problem is for a site.

    If I decide to do a full link audit for a site, then I have another proprietary tool that I programmed in which I upload links from several sources including Ahrefs, GSC, Moz and Majestic and it creates a spreadsheet for me to use for auditing. The spreadsheet shows which of these are on my blacklist, which I can ignore, which are already disavowed and much more.

    I should note, however, that I don’t do nearly as many link audits as I used to do in the past. Now that Penguin ignores links, if you disavow, you’re asking Google to ignore links that they are already ignoring. But, I do believe that there are other algorithms that could demote a site that has a large number of unnatural links.

    I will occasionally see a situation like this in which we disavow and then see organic traffic gains. For this site, we disavowed over 1500 domains from which the company had made links for SEO purposes:

    It’s possible that the disavow was the cause of this organic traffic gain as it wasn’t a seasonal gain. But, we also were working on other quality issues too, so it’s hard to prove.
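Marie’s blacklist and audit tools are her own proprietary software, but the basic idea (check each referring domain in a backlink export against a list of domains you already know you’d disavow) is easy to sketch. Here’s a minimal Python version; the file names and the "Referring Domain" column name are assumptions, and the blacklist is one you’d have to build yourself.

```python
import csv

def load_blacklist(path):
    """One domain per line; '#' starts a comment. This is your own list, not Marie's."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip() and not line.startswith("#")}

def flag_links(export_path, blacklist):
    """Return rows from a backlink export whose referring domain is on the blacklist."""
    flagged = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            # "Referring Domain" is an assumed column name; adjust to match your export.
            domain = row["Referring Domain"].strip().lower()
            if domain in blacklist:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    blacklist = load_blacklist("my_blacklist.txt")       # hypothetical file names
    for row in flag_links("backlinks_export.csv", blacklist):
        print(row["Referring Domain"])
```

Anything it flags still deserves a manual look before it goes anywhere near a disavow file; as Marie says, she doesn’t even trust other people’s disavow decisions.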

I’m with Marie on this one: when looking at your backlink report in Ahrefs, set a filter to “dofollow,” open the pages that link to your website, and evaluate them visually.

Websites created for the sake of linking out are usually easy to spot. The content on their pages won’t make much sense, the images will be a total mess, and they will barely rank for any keywords on Google.

    Editor’s note

    We recently added “Traffic” and “Keywords” columns to our Backlinks report to give you better insights on the linking pages.

But remember: a low Domain Rating and poor organic traffic do not always indicate that a website is low quality.

    It could be that the website is new, and it may grow into a strong one over time.

But when a linking website has a low DR, ugly design and UI, low-quality content, and no organic traffic, this should raise red flags.

    If you’re not happy with the quality of the linking page or you know that the link is not natural, add it to a disavow list in Ahrefs.

    You can easily export this list and upload it to Google’s Disavow Links Tool. (Learn how to do this here.)
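For reference, the disavow file itself is just a plain UTF-8 .txt file: one URL or domain per line, a “domain:” prefix to disavow an entire domain, and “#” for comment lines. The entries below are made-up examples, not real domains.

```
# Disavow file for example.com (made-up examples)
# Disavow a single spammy page:
https://spam-example-1.com/some-page.html
# Disavow everything from a domain:
domain:spam-example-2.com
domain:spam-example-3.net
```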

Joshua Hardwick
Head of Content

Taking Action

To lift a penalty from your website, you must take action to rectify the problems specified in the GSC Manual Actions message(s).

“Unnatural links” to your site is the only penalty that has its roots outside your website.

To rehabilitate your site, you must get rid of those links.

    And to do that, you must send link removal requests to the websites. Simply disavowing these links may not suffice.

    Here is Marie’s take on this process:

    I only do link removal requests these days if a site has a manual action. In that case, you need to show Google evidence that you have worked hard to clean up past link building tactics. But, if you are worried about links hurting you algorithmically, then disavowing is good enough. The exception would be for links that you easily control. If you can easily remove the link, then do that rather than disavowing.

    But what about the thorny issue of submitting a reconsideration request?

    In short, make sure to document every step you take to fix the issues mentioned.

When you send a reconsideration request, you have to demonstrate the effort you made to address all the issues on your website, as well as the results.

    Here’s what Google has to say on the matter:

You should make a good-faith effort to remove backlinks before using the disavow tool. It can also be helpful to document the effort involved in removing those links. Simply disavowing all backlinks without attempting to remove them might lead to rejection of your request.

    If you can’t remove a link to your website because you cannot reach the webmaster, disavow the linking page or domain, but explain why you did so in your reconsideration request.

    If the penalty resulted from thin or scraped content, provide evidence of your improvements. Demonstrate what content you took down and what you added instead.

Although you have more than one chance to send a reconsideration request, double-check that you’ve addressed all the identified problems before you send it.

    Remember that your website will be reconsidered by humans, not by machine algorithms.

    You have to be convincing. “Manual Actions” will be lifted manually.

    Final Thoughts

As you’ve seen in this post, websites mostly suffer from Google’s penalties and filters because of low-quality content and shady SEO techniques.

    If you want to make your website bulletproof, make sure it meets Google’s Quality Guidelines and its link profile is natural.

    Apart from that, monitor your site for hacking to remove hacked content as soon as possible and prevent user-generated spam on your site.

And if you can’t handle a penalty on your own, bring in an expert like Marie.
