ZOMDir > Blog

Friday, 30 August 2013

Taming the bots

Even though nobody has added a link to ZOMDir.com yet, it is a huge site. The reason is that there are 184 languages, 9 initial subjects and circa 200 initial locations (all countries plus the continents). The result is 184 x 9 x 200, or circa 330,000 available webpages.
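That page count is easy to verify with a quick calculation (the figures are the ones from this post):

```python
# Dimensions of the ZOMDir link directory as described above
languages = 184
subjects = 9
locations = 200  # all countries plus the continents

pages = languages * subjects * locations
print(pages)  # 331200, i.e. circa 330,000 pages
```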

I was glad when Googlebot visited the website, but... the load from Googlebot was too heavy. In practice I can handle circa 30,000 pages a day, so it should be no surprise that Googlebot brought the website down.

It wasn't easy to find where to set the crawl rate in Google Webmaster Tools. Initially I was looking at the menu on the left, while the option is actually available under the "settings icon" shown at the top right. The next hurdle was that it was not clear to me that this setting is only available for the domain name zomdir.com, and not for the subdomains www.zomdir.com or thedarksideof.zomdir.com.

After changing the crawl rate I observed that, over time, Googlebot did indeed slow down. I hope that the other bots will follow this instruction in the robots.txt file:
User-agent: *
Crawl-delay: 50
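A Crawl-delay of 50 asks a bot to wait 50 seconds between requests, which caps a single well-behaved bot at a modest number of pages per day, comfortably below the circa 30,000 pages a day the site can handle:

```python
# What a Crawl-delay of 50 seconds means for one polite bot
SECONDS_PER_DAY = 24 * 60 * 60  # 86400
crawl_delay = 50                # seconds between requests, per robots.txt

pages_per_bot_per_day = SECONDS_PER_DAY // crawl_delay
print(pages_per_bot_per_day)  # 1728 pages a day at most
```

Note that Crawl-delay is a de facto convention rather than part of the original robots.txt standard, and not every crawler honours it; Googlebot, for instance, ignores it, which is why the crawl rate had to be set in Google Webmaster Tools instead.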
At this moment it seems that taming the bots has worked. Over the last few days the site has continued to respond while several bots were visiting ZOMDir.com. I hope it stays that way.