Recommended Crawl Rate for Bots

You can set your desired bot crawl delay in your robots.txt file by adding this after the user-agent field:

Crawl-delay: 10

Compliant bots that honor the directive will then wait 10 seconds between requests as they crawl your site for links.
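
For reference, a minimal robots.txt with a crawl delay might look like the sketch below; the wildcard user-agent and the 10-second value are just illustrative choices:

# Apply to every bot that honors Crawl-delay
User-agent: *
Crawl-delay: 10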
My recommendation, however, is not to set a crawl delay at all. You want bots like Googlebot and Bingbot to crawl your website as often as possible so your freshest content shows up in the search results. The only time to add a crawl delay is when you have an underpowered server, perhaps running poorly written code, and you don’t want bots to overwhelm it with traffic and cause it to crash.

Googlebot, however, is pretty smart: if it notices increased response times due to the large number of requests it is sending you, it will back off and crawl more slowly. I’m not sure how Bingbot handles an accidental DoS, but you can set your preferred crawl settings in Bing Webmaster Tools so Microsoft focuses its crawling on off-peak times and avoids overwhelming your server.
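
If you do find yourself in that underpowered-server situation, one option is to scope the delay to the specific crawler that is causing trouble rather than to every bot. The sketch below is only an example; the bot name and delay value are assumptions you would adjust for your own logs:

# Slow down only this crawler; the delay value is an example
User-agent: Bingbot
Crawl-delay: 10

# Everything else crawls normally
User-agent: *
Disallow: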

In terms of SEO, faster crawling is better, and quality new content is key.
Questions and experiences in the comments!
Cheers,
Luke
