How to change Googlebot’s crawl rate
Google uses sophisticated algorithms to determine how often to crawl your website. Its goal is to crawl as many pages as possible on each visit without overwhelming your server’s bandwidth.
But if Googlebot’s crawling is causing server issues, you can reduce its crawl rate from Google Webmaster Tools.
Note: If you are not experiencing any server issues, it is best to leave the default setting. The settings page itself warns: “Don’t limit crawl rate unless Google is slowing down your server.”
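Before limiting the crawl rate, it is worth confirming that Googlebot traffic is actually a significant share of your server load. A minimal sketch, assuming your server writes logs in the common Apache “combined” format (the sample lines and file contents below are made up for illustration):

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache "combined" log format; in practice
# you would read these from your own access log file.
SAMPLE_LOG = """\
66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [10/Oct/2023:13:55:37 +0000] "GET /page2 HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (Windows NT 10.0)"
66.249.66.1 - - [10/Oct/2023:13:55:38 +0000] "GET /page3 HTTP/1.1" 200 256 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

def count_by_agent(log_text):
    """Count requests whose user-agent string mentions Googlebot vs. everything else."""
    counts = Counter()
    for line in log_text.splitlines():
        # In the combined format, the user-agent is the last quoted field.
        agents = re.findall(r'"([^"]*)"', line)
        agent = agents[-1] if agents else ""
        counts["googlebot" if "Googlebot" in agent else "other"] += 1
    return counts

print(count_by_agent(SAMPLE_LOG))  # e.g. Counter({'googlebot': 2, 'other': 1})
```

If Googlebot accounts for only a small fraction of requests, limiting its crawl rate is unlikely to help, and the bottleneck lies elsewhere. Note that anyone can fake a Googlebot user-agent string, so this is a rough check, not a verification.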
Changing Google’s crawl rate only changes the speed of Googlebot’s requests during the crawl process. It does not affect how often Google crawls your site or how deeply your URL structure is crawled. To change Google’s crawl rate on your website:
- Log in to Google Webmaster Tools (register first if you don’t have a Google Webmaster account).
- Add your site to Google Webmaster Tools.
- On the homepage of Webmaster Tools, click the website whose crawl rate you would like to adjust.
- Click on the GEAR ICON at the upper-right corner of the page, and select SITE SETTINGS from the drop-down list.
- In the CRAWL RATE section, choose “Limit Google’s maximum crawl rate.”
- Set your preferred crawl rate, and then click SAVE.
The new crawl rate stays in effect for 90 days; after that, it resets to the default, which is the setting Google recommends.