Keyword Difficulty: How to Determine Your Chances of Ranking in Google

Tim Soulo
Tim is the CMO at Ahrefs. But most importantly he’s the biggest fanboy and the truest evangelist of the company.
Article stats
  • Linking websites 396
  • Tweets 420
Data from Content Explorer


    Finding a bunch of keywords that you want to rank for is easy. The hard part is figuring out what it’ll take to rank for those keywords (i.e., “keyword difficulty”), and then using that information to prioritize and plan your SEO strategy.

    Nobody knows the exact formula Google uses to rank web pages. Even if they did, it wouldn’t change the fact that some keywords are more difficult to rank for than others. That’s what makes the concept of “keyword difficulty” one of the key challenges in SEO.

    Now, some of you may be aware that we (Ahrefs) have a metric called Keyword Difficulty, which thousands of SEO professionals rely on when performing keyword research.

    But, at the same time, we’ve found that many users expect way too much from this metric.

    So, in this article, I’d like to explain the general concept of keyword difficulty (i.e., what seasoned SEOs pay attention to when trying to estimate their chances of ranking in Google) and show you how Ahrefs’ Keyword Difficulty metric is helpful.


    Don’t confuse keyword difficulty with “keyword competition” found in the Google Keyword Planner (see screenshot below). Keyword competition shows you how many advertisers are bidding on ads in the search results for a given keyword. Keyword difficulty shows you how hard it is to rank for that keyword in organic search.

    01 Google keyword competition

    How to determine the ranking difficulty of a keyword

    Most SEO professionals judge the “ranking difficulty” of a keyword by analyzing the pages that already rank in Google for known factors that correlate with rankings.

    This essentially boils down to 4 main attributes:

    1. Content of the page
    2. Searcher intent
    3. Links from other websites
    4. Domain/website authority

    I’ll expand on each of these contributing factors below. But before I do that, there’s something you need to consider.

    There are quite a few different “schools of thought” in SEO, which means that some SEO professionals will disagree with some of the statements and ideas below. I have absolutely no problem with that, and I encourage you to carefully examine any contradicting arguments you come across. It may be that a different school of thought resonates more with you.

    That said, let’s take a closer look at the aforementioned contributing factors…

    1. Content of the page

    Google doesn’t rank irrelevant pages. But how do you make your page relevant to a target keyword?

    I have a better question for you:

    How do you make your page NOT relevant to a target keyword?

    Case in point: we recently published an article about 301 redirects with the aim of ranking in Google for “301 redirects.” Because the overall topic of the post and the target keyword align so closely, there was literally no way to write this post while simultaneously making it irrelevant to the target keyword.

    See where I’m going with this?

    If you write an article and target a specific keyword, it will inevitably be relevant to that keyword.

    Yet, most on-page SEO advice states that you need to mention your target keyword in certain prominent places on the page to increase its relevance in the eyes of Google.

    But isn’t that a given? Aren’t you likely to do those things anyway?

    For example, I’m 500 words into this post about “keyword difficulty” and I’ve mentioned that keyword seven times already, despite making zero effort to do so. It happened naturally.

    Years ago, things were different. You could indeed fool Google with these kinds of tricks. If you wanted to outrank your competitor, mentioning your target keyword more times than them and stuffing your title tag with keyword variations was the name of the game.

    But those days are long gone. Google is now smart enough to pull tricks like this one:

    guest writing Google Search

    Here, the #1 search result for “guest writing” doesn’t have that keyword in its title.

    But, wait for it…

    The Ultimate Guide to Guest Blogging

    There’s also not a single mention of the keyword “guest writing” anywhere on that page!

    Clearly, Google is smart enough to understand synonyms and figure out that “guest writing” and “guest blogging” are the same thing.

    So, if you still see the absence of the target keyword in the title tags of the top-ranking pages as a sign of low keyword difficulty, think again.

    When analyzing a keyword, you shouldn’t pay much attention to how many times and where your target keyword is mentioned on a page. Instead, you should dig into the content of the top-ranking pages to determine whether they’re written by someone with good knowledge and understanding of the topic at hand, or churned out by a $3/hr freelance copywriter.

    The quality of your content and value that it brings to readers is what gives you a competitive edge, not “strategic keyword placements.”

    “Tim, are you trying to say that Google is smart enough to read the actual content of a page and tell whether it was written by a person with domain expertise or just a random copywriter?”

    No. I’m pretty sure Google can’t do that. Otherwise, they wouldn’t allow a “lorem ipsum” website to outrank legit sites.

    But I do believe they have enough factors baked into their algorithm to allow them to “guesstimate” the depth, authority, and trustworthiness of a given piece of content.

    What about things like LSI and TF*IDF?

    At this stage, some of you may be wondering about our stance on those on-page SEO tools that claim to help you win in the search results by sprinkling the “right” words into your content in the right quantities.

    Most of these tools claim to use smart-sounding technologies like LSI or TF*IDF to help do this.

    So here it goes…

    Google almost certainly uses a word vector approach (for RankBrain), but there’s no evidence to suggest that they use LSI. In fact, given how LSI works, all evidence points to the contrary. So there’s not much point in using those LSI keyword tools to generate keywords to include in your content.

    As for TF*IDF, that one’s a bit more likely, but still…

    Here at Ahrefs, we don’t recommend obsessing over fancy-sounding hacks like these. We believe it makes more sense to focus on crafting excellent content and a fantastic user experience above all else.

    In other words, we don’t believe there’s a way to quantify the “relevance score” of your content and win in the search results by improving that score, especially given the “searcher intent” factor that I’ll discuss in just a moment.
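    For reference, here’s what TF*IDF actually computes. It’s a decades-old term-weighting scheme from information retrieval, not a Google-specific technology. A minimal sketch (the corpus and terms below are made up purely for illustration):

    ```python
    import math

    def tf_idf(term, doc, corpus):
        """Classic TF*IDF: frequency of a term in one document, scaled by
        how rare the term is across the whole corpus."""
        tf = doc.count(term) / len(doc)
        docs_with_term = sum(1 for d in corpus if term in d)
        idf = math.log(len(corpus) / (1 + docs_with_term))
        return tf * idf

    # Toy corpus: three "pages" represented as token lists (made-up data).
    corpus = [
        ["seo", "tools", "backlink", "checker"],
        ["keyword", "difficulty", "seo"],
        ["guest", "blogging", "guide"],
    ]

    # "backlink" appears in only one document, so it gets a positive score;
    # "seo" appears in two of the three documents, so its score is lower.
    score = tf_idf("backlink", corpus[0], corpus)
    ```

    The idea is simply that a term matters more when it’s frequent in one document but rare across the rest. That’s useful for building a search index, but it says nothing about how Google judges the quality or depth of your content.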


    When analyzing the top-ranking pages, make an effort to assess the overall quality and depth of the content as opposed to looking for traditional indicators of poor optimization (e.g., keyword in the title, etc.). Your job is then to create a better page than those that already rank, not a “better optimized” one.

    Furthermore, don’t try to measure the relevance of a page with any kind of “score”; whatever methodology or technology you’re using for that, Google almost certainly isn’t using it.

    2. Searcher intent

    Here’s our ranking history for the keyword “backlink checker” as seen in Ahrefs Site Explorer:

    Organic keywords for ahrefs com backlink checker

    Back in February 2017, we published a landing page with the aim of ranking for the keyword “backlink checker.” The content of that page was perfectly “optimized” for its topic, as we described the awesome backlink research features that we have in Ahrefs.

    A couple of months after publishing (and thanks to some smart internal linking), our page climbed to position #5 for its target keyword.

    We were ecstatic!

    But that didn’t last long. Just a few days later Google started pushing us down, and quite soon we found ourselves on the second page of search results (which is the best place to hide dead bodies).

    Being the “professional SEOs” that we are, we started optimizing that landing page with every trick in the book. We:

    • Improved the copy of the page;
    • Improved the load speed;
    • Improved “mobile friendliness;”
    • Worked on acquiring quality backlinks;
    • Etc.

    And still, it took us almost a full year to climb back to position #6.

    Then… it refused to move any higher.

    That’s when we realized that we were digging in the wrong direction all along.

    Why? Because despite polishing the more “traditional” SEO factors to perfection, the way our landing page addressed search intent was lacking. It did nothing more than advertise our paid 7‑day trial.

    Ahrefs SEO Tools Backlink Checker SEO Report

    Conversely, all the pages that ranked above us were backlink checkers that let users check backlinks completely free. No registration or credit card details required.


    That’s when we decided we could better serve searcher intent by launching a completely free backlink checker tool—one that was way better than those currently on offer from the sites that outranked us.

    So that’s what we did:

    Free Backlink Checker by Ahrefs Check Backlinks for Any Website

    Within a week, our page jumped to the #1 position, and it’s been there ever since! (It’s been six months now.)

    But that’s not all…

    Look at the increase in organic traffic and keyword rankings after we nailed searcher intent:

    Overview ahrefs com backlink checker on Ahrefs

    That’s an almost 6x improvement!

    So how does this work?

    Google seems to have some means of figuring out whether searchers are happy with the search results. Precisely how they do this is unclear, but things like pogo-sticking, dwell time, and potentially bounce rate are likely all part of the equation.

    Either way, the critical point is this: As soon as they see that users “like” a specific page more than others, they will rank that page at the top.

    In our case, it seems like the searchers are so overwhelmingly satisfied with our free backlink checker tool that Google keeps ranking our page for more and more related keywords, increasing the total search traffic that our page gets over time.

    I’m quite confident that now we could easily strip 90% of the content from that landing page and continue to rank at the top. Users seem to only care about finding a free backlink checking tool, and not our musings about how great our tool happens to be.


    Analyze the top-ranking pages for your target keyword to decide if you can offer searchers something better than what’s already there. If you can’t do that, then that keyword is effectively too tricky for you, no matter what the SEO tools say.

    Editor’s Note

    An excellent way to learn whether Google is happy with the current search results is to look at the SERP position history.

    Here’s the SERP position history graph that Ahrefs Keywords Explorer shows for the keyword “SEO tools”:

    seo tools serp history

    Those same five pages have been ranking at the top for quite a while. They’ve hardly shifted.

    Compare that with the SERP ranking history for the keyword “SEO forums”:

    seo forums position history

    The pages have jumped in and out of the top 10 for almost a year. Only recently does Google seem to have figured out what kind of search results best satisfy its users.

    Some SEOs see chaotic SERPs as an opportunity to rank:

    Reason being, Google clearly isn’t happy with the current results as it chooses not to rank any one of them at the top for more than a few days at a time. So, if you can crack search intent, the ranking is yours for the taking.

    But while this is sometimes the case, it can also indicate a difficult-to-crack SERP.

    That’s because Google has no clue what searchers are after, and most likely, neither do you. It may also be the case that search intent is continually changing, in which case ranking long-term will be next to impossible.

    Joshua Hardwick
    Head of Content

    3. Links from other websites

    Finally, we move from the more abstract concepts like “quality content” and “searcher intent” to something much easier to measure: backlinks.

    Links from other websites are an essential part of Google’s ranking algorithm (see my post about PageRank for more details).

    Google sees them as “votes” that mean your page deserves to rank high.

    But, of course, Google’s algorithm doesn’t only count the raw number of websites linking to a page—it’s way more sophisticated than that. The “quality” of the linking page also matters, along with things like:

    • The number of other sites to which the linking website links
    • How deeply buried in the website’s structure your link happens to be
    • The actual context and anchor text of that link
    • etc.

    Unfortunately, all these variables make it very hard to accurately calculate the so-called “link value” of a page, as Google sees it.

    Having said that, as a general rule, the more links you get to your page from other websites, the higher your page will rank in Google.

    We know this because we studied it and saw a clear correlation.

    page level backlink factors image

    Do you see how the raw number of backlinks has a slightly weaker correlation than the number of unique referring domains (i.e., links from individual websites)?

    That means you’re better off getting ten links from ten different websites than ten links from a single site (all else being equal).
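    Correlation studies like this one typically report Spearman’s rank correlation between a factor and ranking position. Here’s a self-contained sketch of that calculation (the SERP data below is made up; real studies aggregate thousands of SERPs):

    ```python
    def spearman(xs, ys):
        """Spearman rank correlation: Pearson correlation computed on ranks.
        Ties are not handled here, which is fine for this toy example."""
        def ranks(vals):
            order = sorted(range(len(vals)), key=lambda i: vals[i])
            r = [0.0] * len(vals)
            for rank, i in enumerate(order):
                r[i] = float(rank)
            return r
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)

    # Made-up example: positions 1-5 and their referring-domain counts.
    positions = [1, 2, 3, 4, 5]
    ref_domains = [900, 400, 350, 120, 60]
    # More referring domains goes with a better (lower) position,
    # so the correlation here is strongly negative.
    ```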

    And that’s why we calculate Keyword Difficulty (KD) by taking a weighted average of the number of referring domains to the top-ranking pages.
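    Ahrefs hasn’t published the exact formula, but conceptually a link-based difficulty score along these lines could be sketched as follows (the weights and the sample data here are hypothetical, not Ahrefs’ actual implementation):

    ```python
    def keyword_difficulty(ref_domains_top10):
        """Hypothetical sketch: a weighted average of referring-domain counts
        for the top 10 results, with higher-ranked pages weighted more.
        Ahrefs' real formula (weights, scaling onto 0-100) is not public."""
        weights = list(range(10, 0, -1))  # rank #1 counts most
        weighted = sum(w * rd for w, rd in zip(weights, ref_domains_top10))
        return weighted / sum(weights)

    # Made-up referring-domain counts for a hypothetical SERP, ranks 1-10.
    serp = [310, 280, 240, 260, 150, 120, 90, 110, 70, 40]
    difficulty = keyword_difficulty(serp)  # higher = more links likely needed
    ```

    In a real tool, the resulting average would then be mapped onto a 0–100 scale; the point of the sketch is only that the score is driven by the referring-domain counts of the pages already ranking.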

    Keywords Explorer

    On the screenshot above you can see that the keyword “backlink checker” has a KD score of 74, which is considered to be “Super hard.” And right below this number, we have a hint that says:

    We estimate that you’ll need backlinks from ~247 websites to rank in top 10 for this keyword. 

    That is a very carefully crafted phrase.

    Let me break it down for you:

    • “We estimate…” This means that we’re not 100% confident in the statement that follows. It’s just an educated guess.
    • “…you’ll need backlinks from ~247 websites…” The critical part here is the “~” sign right before the number. It means “approximately,” so this number should not be treated as a precise one.
    • “…to rank in top 10 for this keyword.” We estimate that you will rank in the top 10 if you can convince that many websites to link to your page. But we can’t promise you’ll rank #1 or even in the top 5.

    So why are we so cautious with that wording?

    Well, we know that if you can get as many links as the current top-ranking pages, then you’re likely to rank somewhere amongst them. But once you reach that first page of search results, Google starts to look at other “ranking signals” to determine the ranking position of your page (see my story about our “backlink checker” landing page above).

    Creating a keyword difficulty score that accurately predicts the chance of a #1 ranking would be about as easy as building your own search engine.

    Which is something that our founder and CEO actually announced recently.

    But until we do this (which will no doubt take us some time), our Keyword Difficulty metric is likely to stay purely link-based.


    Numerous case studies from the past few years show that backlink factors have a higher correlation with rankings than anything else. That’s why our Keyword Difficulty (KD) score is based solely on the average number of linking websites to the current top 10 ranking pages, which makes it super easy to comprehend.

    Still, you should only use this number as the first layer of your research because searcher intent and content quality often play a substantial role when it comes to rankings too.

    Editor’s Note

    Since we’re already talking about Ahrefs’ KD metric, let me briefly address the most common question we get about it:

    “Which Keyword Difficulty should I target with my website?”

    Ahrefs’ KD metric is basically a proxy for the average number of referring domains across the top 10 ranking pages. Specifically:

    KD 0 = 0 Ref. Domains
    KD 10 = ~10 Ref. Domains
    KD 20 = ~20 Ref. Domains
    KD 30 = ~35 Ref. Domains
    KD 40 = ~55 Ref. Domains
    KD 50 = ~80 Ref. Domains
    KD 60 = ~130 Ref. Domains
    KD 70 = ~200 Ref. Domains
    KD 80 = ~350 Ref. Domains
    KD 90 = ~800 Ref. Domains

    So the answer to this question comes down to how many backlinks you can acquire to the pages on your website. To get a rough sense of this, paste your domain into Ahrefs Site Explorer and go to the “Best by links” report:

    Best by links ahrefs com blog on Ahrefs

    On the screenshot above, you can see that our blog homepage has backlinks from 957 different websites (referring domains), while our five most-linked articles have backlinks from 453–670 websites each.

    So it seems like we can safely target keywords with a KD score of up to 90 on our blog, right?

    Not quite. Those are our five most-linked articles. It took us many years of blood, sweat, and tears to generate that many backlinks to each of those posts. If you scroll down that report, you’ll see that the vast majority of our articles have fewer than 100 backlinks. So I’d say that the KD score that we can comfortably target is anything up to 50.

    Do this exercise with your website and see how many backlinks your pages have.

    Joshua Hardwick
    Head of Content

    4. Domain authority

    This contributing factor is likely the most controversial of all.

    While the majority of SEO professionals believe that Google has some kind of domain-wide quality metric that influences every page on a given website, Google’s representatives consistently manage to evade giving a direct answer when asked about this.

    Here’s a good example:

    Google’s John Mueller replied to my tweet about domain authority, saying that while they don’t use this metric per se, they do sometimes “look at things a bit broader.”


    But if you ask me what the Ahrefs team thinks about the influence of “domain authority” on a page’s ranking position, I’ll give you three statements:

    1. We believe that Google sometimes gives preference to pages on “strong” websites in the search results. But it’s tough to tell whether this is the result of a clear preference for high-authority sites on Google’s part, or whether there’s some other indirect cause—like a preference for results from well-known brands for some queries.
    2. We believe in the effectiveness of internal linking for “strong” websites. In other words, we think that websites with “powerful” pages can funnel some of that “power” to other pages via internal links, thus helping them to rank higher in Google.
    3. We believe that a page on a “weak” website will outrank one on a “strong” website if it has more high-quality backlinks in comparison.

    Here at Ahrefs, we have our own “domain authority” metric called Domain Rating (DR), but it is based purely on links. The way Google measures “domain authority” likely includes many factors other than links.

    Speaking of link factors…

    A few years ago, we ran a correlation study and plotted page-level backlink factors next to domain-level backlink factors:

    05 page authority VS domain authority

    Conclusion: Page-level factors correlate much more strongly than domain-level factors.

    That kind of reconfirms what John Mueller said in that tweet I shared earlier—Google tries to go as granular as possible with their metrics.


    Websites with “high authority” are essentially just brands that people know and trust, and therefore click.

    For example, if you search for news, then you’re likely to click results from BBC or CNN over a random blogger. If you search for SEO, then you’re likely to click results from Ahrefs, Moz, SEJ, etc. Google is clearly rewarding the search results that people are more likely to click.

    So, it’s going to be more challenging to rank in SERPs that feature a lot of “big names” in your niche, but not impossible. Our data says that it’s possible to outrank pages on high-authority domains with “strong” pages (i.e., those with lots of high-quality backlinks) on “weaker” domains. That is, unless searchers have zero motivation to click on your site over the familiar brands they see ranking alongside it.

    Don’t shy away from high-difficulty keywords

    Most articles about keyword research tell you to avoid high-difficulty keywords in favor of those with high search volumes and low Keyword Difficulty scores.

    That makes total sense. But the reality is that the number of such keywords in any particular niche is incredibly low. So when you fail to find any, you might conclude that keyword research simply doesn’t work for you.

    That’s the wrong attitude.

    Keyword difficulty isn’t meant to discourage you from ranking for a particular keyword. It is intended to help you estimate the resources required to rank for it.

    If ranking for a keyword is vital to your business, you have to pursue it—no matter what it takes.

    Five years ago, when I inherited the Ahrefs blog, I didn’t see a lot of “easy” keywords for which we could rank. The best ones were already “taken” by the other big names in our niche.

    But look at the growth of our search traffic over the past five years:

    ahrefs blog traffic growth

    We didn’t achieve that level of success by cherry-picking low-difficulty keywords. We did the opposite. We wrestled with the big guys for highly-competitive keywords right from the very start.

    Did rankings come thick and fast? Not at all. We’re not an overnight success story by any stretch. But that was never our goal.

    Our goal was to create the best SEO blog in our industry, and we knew that would take time.

    In SEO, short-term thinking is the greatest enemy of long-term rankings…

    … so don’t shy away from those seemingly competitive keywords that have the potential to skyrocket your business growth. 🙂
