Ahrefs Bots

At Ahrefs, we operate two primary web crawlers—AhrefsBot and AhrefsSiteAudit—to support our suite of tools and services. The goal of our crawling is to help site owners improve their online presence while minimizing the load on their servers and ensuring safe, transparent crawling behavior.

Our bots

AhrefsBot

User-agent string: Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)

Robots.txt
  • User-agent token in robots.txt: AhrefsBot
  • Obeys robots.txt: Yes
  • Obeys crawl delay: Yes

Purpose: Powers the database for both Ahrefs, a marketing intelligence platform, and Yep, an independent, privacy-focused search engine.

AhrefsSiteAudit

Desktop user-agent string: Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)

Mobile user-agent string: Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.5359.128 Mobile Safari/537.36 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)

Robots.txt
  • User-agent token in robots.txt: AhrefsSiteAudit
  • Obeys robots.txt: Yes by default (website owners can request that it disobey robots.txt on their own sites)
  • Obeys crawl delay: Yes by default (website owners can request that it disobey crawl delay on their own sites)

Purpose: Powers Ahrefs’ Site Audit tool. Ahrefs users can use Site Audit to analyze websites and find both technical SEO and on-page SEO issues.

Cloudflare verified

Both AhrefsBot and AhrefsSiteAudit are recognized as verified "good" bots by Cloudflare, a leading web security and performance company.

IndexNow partner

Yep, a search engine developed by Ahrefs, is an official participant in the IndexNow protocol, alongside other major search engines. Through IndexNow, website owners can instantly notify us when content is updated, ensuring more timely and accurate indexing.
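As a sketch of how the protocol works, a notification is just an HTTP GET carrying the changed URL and the site's IndexNow key. The endpoint below follows the public IndexNow specification; the page URL and key are hypothetical examples:

```python
from urllib.parse import urlencode

def indexnow_ping_url(page_url: str, key: str,
                      endpoint: str = "https://api.indexnow.org/indexnow") -> str:
    """Build the GET URL that notifies IndexNow-participating engines
    that `page_url` was created, updated, or deleted. `key` is the
    site's IndexNow key, which must also be hosted at the site root."""
    return f"{endpoint}?{urlencode({'url': page_url, 'key': key})}"

# Hypothetical example; fetching this URL performs the notification.
ping = indexnow_ping_url("https://example.com/new-page", "abc123")
```

Fetching the resulting URL (for example with `urllib.request.urlopen`) submits the change to all participating engines at once.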

Verification and IP lists

IP addresses

We crawl from publicly published IP ranges. You can pull our IP addresses as IP ranges or individual IPs. You can find information on how to whitelist our IPs in the help article.

Reverse DNS

The reverse DNS suffix of our IPs’ hostnames is always ahrefs.com or ahrefs.net.
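A minimal verification sketch using Python's standard library. The suffix check is a pure function you can test offline; the live `socket` lookups require network access, and a forward-confirmation step guards against spoofed PTR records:

```python
import socket

AHREFS_SUFFIXES = (".ahrefs.com", ".ahrefs.net")

def hostname_is_ahrefs(hostname: str) -> bool:
    """True if a reverse-DNS hostname falls under ahrefs.com/ahrefs.net."""
    return hostname.rstrip(".").lower().endswith(AHREFS_SUFFIXES)

def verify_crawler_ip(ip: str) -> bool:
    """Reverse-resolve the IP, check the suffix, then confirm the
    hostname resolves back to the same IP (forward-confirmed rDNS)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # PTR lookup
    except socket.herror:
        return False
    if not hostname_is_ahrefs(hostname):
        return False
    try:
        # Forward lookup must return the original IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

Combined with the published IP lists, this lets server operators distinguish genuine Ahrefs crawlers from impostors sending a forged user-agent string.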

Website status

You can check the status of your website as it's seen by our bots and whether it can be crawled by them.

Benefits for site owners

AhrefsBot indexes fresh, accurate information about websites, their content, and how they link to each other. This data is incredibly useful and can be harnessed in many ways:

  • AhrefsBot powers Yep—an independent, privacy-focused search engine. Being included in Yep’s index helps site owners reach a new audience.
  • AhrefsBot feeds data into the Ahrefs toolset. Website owners can create a free Ahrefs webmaster account and verify domain ownership to unlock site analytics, including access to in-depth backlink data, website performance metrics, and content change monitoring. Ahrefs also offers a suite of free SEO tools that anyone can use without creating an account.
  • AhrefsSiteAudit powers our Site Audit tool. Site Audit checks websites for technical and on-page issues such as broken links, slow performance, security misconfigurations, and SEO pitfalls. By crawling and rendering pages, we help identify improvements that can boost visibility, loading speed, and overall user experience. Ahrefs also provides the option to run Site Audit for free on verified websites, helping site owners discover and fix technical issues without incurring any charges.

Policies and commitments

Obeying Robots.txt

Both bots strictly respect robots.txt, including both disallow and allow rules, as well as crawl-delay directives. Only verified site owners can allow the AhrefsSiteAudit crawler to disobey robots.txt on their own sites, so they can check for issues in site sections that are normally disallowed from crawling.

Crawl-delay is strictly followed when requesting HTML pages, ensuring that we do not exceed the specified rate limits. However, it cannot be followed when rendering JavaScript. When our bots render a page, they may request multiple assets (e.g., images, scripts, stylesheets) simultaneously, which can result in more frequent requests appearing in server logs than allowed by the crawl-delay setting. This behavior mimics a real user's experience, as modern web pages often require multiple resources to be loaded at once for proper rendering and functionality.

Caching assets

During crawls, we cache frequently requested files (images, CSS, JS) to minimize repeated fetches, which reduces bandwidth consumption and server load.
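A toy sketch of this kind of per-crawl caching (the `fetch` callback and any URLs are hypothetical; this is not Ahrefs' actual implementation):

```python
class AssetCache:
    """Per-crawl asset cache: each static asset URL is fetched at most
    once, no matter how many pages reference it."""

    def __init__(self, fetch):
        self._fetch = fetch   # callable: url -> bytes
        self._store = {}
        self.hits = 0
        self.misses = 0

    def get(self, url: str) -> bytes:
        if url in self._store:
            self.hits += 1          # already fetched: no server load
        else:
            self.misses += 1        # first reference: fetch and keep
            self._store[url] = self._fetch(url)
        return self._store[url]
```

When many pages on a site share one stylesheet or script, the server sees a single request for it instead of one per page, which is the bandwidth saving the section describes.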

Load management

If we encounter non-200 status codes, especially 4xx or 5xx errors, we automatically reduce our crawling speed for that site. This ensures minimal stress on sites that may be experiencing outages or high server load.
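One simple way to model this kind of back-off (a sketch under illustrative parameters, not Ahrefs' actual algorithm):

```python
def next_crawl_delay(current_delay: float, status: int,
                     base: float = 1.0, cap: float = 300.0) -> float:
    """Double the per-request delay on 4xx/5xx responses (capped at
    `cap` seconds) and ease back toward `base` on successes."""
    if status >= 400:
        return min(max(current_delay, base) * 2, cap)
    return max(current_delay / 2, base)
```

Errors quickly slow the crawler down, while a run of healthy 200 responses gradually restores the normal rate.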

Transparent practices

We understand that hosting providers, CDNs, and CMS platforms may want to manage how bots interact with their customers’ sites. Our publicly available IP addresses and user-agent strings let you or your service providers quickly verify legitimate Ahrefs traffic. We’re committed to being transparent about our crawling activities to foster trust and collaboration. If you have any concerns, reach out to [email protected] and we'll do our best to help.

Controlling bot behavior

We provide clear, user-friendly options to control our bots:

Via Robots.txt

To change the frequency at which AhrefsBot or AhrefsSiteAudit visits your site, specify the minimum acceptable delay between two consecutive requests in your robots.txt file:

User-agent: AhrefsBot
Crawl-Delay: [value]

(Where the Crawl-Delay value is the time in seconds.)

If you want to stop AhrefsBot or AhrefsSiteAudit from visiting your site or a section of it, use Disallow directives:

User-agent: AhrefsBot
Disallow: /path-to-disallow/

Please note that AhrefsBot may need some time to pick up changes in your robots.txt file; changes are applied before the next scheduled crawl. Verified site owners can allow the AhrefsSiteAudit crawler to disobey robots.txt on their own sites so they can check for issues in site sections that are normally disallowed from crawling.

Also, if your robots.txt contains errors, our bots won’t be able to recognize your commands and will continue crawling your website the way they did before. Read more about robots.txt at www.robotstxt.org.
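You can test how such rules will apply before deploying them, for example with Python's standard-library robots.txt parser (the rules and URLs below are illustrative):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: AhrefsBot
Crawl-delay: 10
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths outside the Disallow rule remain crawlable.
allowed = parser.can_fetch("AhrefsBot", "https://example.com/blog/post")
# Disallowed sections are blocked for the AhrefsBot token.
blocked = parser.can_fetch("AhrefsBot", "https://example.com/private/data")
delay = parser.crawl_delay("AhrefsBot")
```

Checking your file this way catches syntax mistakes that would otherwise leave your directives unrecognized.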

Returning non-200 status codes to reduce crawling speed

You can temporarily reduce the AhrefsBot crawling speed. This can be useful during outages or infrastructure changes, when the load on your site should be reduced. To do so, return 4xx or 5xx HTTP status codes for the duration of the outage or maintenance window. Our bot will detect these errors and back off automatically.
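A minimal sketch of this behavior using Python's standard library. In practice this is usually configured at the web server or CDN layer, and the Retry-After value here is illustrative:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE = True  # flip to False once the outage is over

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if MAINTENANCE:
            # 503 tells well-behaved crawlers to back off;
            # Retry-After hints (in seconds) when to try again.
            body = b"Down for maintenance\n"
            self.send_response(503)
            self.send_header("Retry-After", "3600")
        else:
            body = b"OK\n"
            self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the example quiet
        pass
```

Running `HTTPServer(("", 8080), MaintenanceHandler).serve_forever()` would answer every GET with 503 until `MAINTENANCE` is flipped back.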

Adjusting speed settings in Site Audit

The AhrefsSiteAudit bot prevents excessive load on website servers by limiting crawling to a maximum of 30 URLs per minute. If you're a website owner, you can crawl your own sites at higher speeds to be notified about site issues faster. To do that, you'll need to verify ownership in the Site Audit tool.
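The default cap works out to one request every two seconds. A pacing sketch (illustrative, not the actual Site Audit scheduler):

```python
class CrawlPacer:
    """Pace requests to at most `per_minute` URLs per minute by
    enforcing a minimum interval between consecutive requests."""

    def __init__(self, per_minute: int = 30):
        self.interval = 60.0 / per_minute  # 2.0 s at 30 URLs/min
        self._last = None                  # scheduled time of last request

    def delay_before_next(self, now: float) -> float:
        """Seconds to wait before issuing the next request at time `now`."""
        if self._last is None:
            wait = 0.0
        else:
            wait = max(0.0, self._last + self.interval - now)
        self._last = now + wait
        return wait
```

A crawler would call `delay_before_next(time.monotonic())` before each fetch and sleep for the returned duration; verified owners effectively raise `per_minute` for their own sites.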

Contacting us

If you have any concerns about how frequently we crawl or if you see suspicious traffic you want to confirm, please get in touch at [email protected]. We’re here to help clarify and resolve any issues.