Learn about the web crawler that powers Ahrefs' Site Audit tool.
| | |
|---|---|
| |Good (Identifies itself, has an official moniker)|
|Obeys robots.txt|Yes by default (website owners can request to disobey robots.txt on their sites)|
|Obeys crawl delay|Yes by default (website owners can request to disobey crawl delay on their sites)|
|Desktop user-agent string|Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)|
|Mobile user-agent string|Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.5359.128 Mobile Safari/537.36 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)|
|Reverse DNS suffix| |
|IP address range|<a1>Current list</a1>, <a2>API documentation</a2>|
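If you want to recognize this crawler in your own server logs or middleware, one simple approach is to match the `AhrefsSiteAudit` token in the User-Agent header. The sketch below is illustrative, not an official detection method; the function name `is_ahrefs_site_audit` is made up for this example.

```python
import re

# Matches the "AhrefsSiteAudit/<version>" token found in both the
# desktop and mobile user-agent strings documented above.
UA_PATTERN = re.compile(r"\bAhrefsSiteAudit/(\d+(?:\.\d+)*)")

def is_ahrefs_site_audit(user_agent: str) -> bool:
    """Return True if the User-Agent header identifies the AhrefsSiteAudit bot."""
    return UA_PATTERN.search(user_agent) is not None

desktop_ua = ("Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; "
              "+http://ahrefs.com/robot/site-audit)")
print(is_ahrefs_site_audit(desktop_ua))           # True
print(is_ahrefs_site_audit("Mozilla/5.0 (X11)"))  # False
```

Note that User-Agent strings can be spoofed by third parties; for stronger verification, cross-check the request's IP address against the published IP list or its reverse DNS record.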
AhrefsSiteAudit is a web crawler that powers <a>Ahrefs' Site Audit tool</a>. Ahrefs users can use Site Audit to analyze websites and find both technical SEO and on-page SEO issues.
This bot can crawl any website unless disallowed, and it prevents excessive load on website servers by limiting crawling to one request every two seconds by default. If you're a website owner, you can have your own sites crawled at higher speeds and allow the AhrefsSiteAudit crawler to ignore robots.txt. To do that, you'll need to verify ownership of the site in the Site Audit tool.
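Because the bot respects robots.txt and crawl-delay directives by default, you can control it with standard robots.txt rules. A minimal sketch (the `/private/` path is a placeholder for whatever you want to exclude):

```
# Slow the AhrefsSiteAudit bot to one request every 10 seconds
# and keep it out of a private section of the site
User-agent: AhrefsSiteAudit
Crawl-delay: 10
Disallow: /private/
```

Place this file at the root of your site (e.g. `/robots.txt`); verified site owners can instead adjust crawl speed directly in the Site Audit tool.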
What is the AhrefsSiteAudit bot doing on my website?
Does it respect the robots.txt file?
How do I control the AhrefsSiteAudit crawler on my website?
If you have more questions about our pricing and plans, contact us so we can help.