How do I rate-limit Google's crawl of my class C IP block?
Posted by Zak on Server Fault, 2010-04-02
Tags: spider | crawl-rate
I have several sites on a class C network that all get crawled by Google on a fairly regular basis. Normally this is fine, but when Google starts crawling all of the sites at the same time, the small set of servers backing this IP block takes a pretty big hit on load.
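One approach, if the same web server (or a single front end) serves all of the sites, is to throttle any request whose User-Agent matches Googlebot against one shared bucket. A minimal nginx sketch, assuming nginx fronts the sites; the zone name, the 2 r/s rate, and example.com are placeholders, not values from the question:

```
http {
    # Map Googlebot requests to a single constant key; every other
    # request gets an empty key, which limit_req does not count.
    map $http_user_agent $googlebot {
        default       "";
        "~*googlebot" "bot";
    }

    # Because the zone lives at the http level and the key is the same
    # constant for all Googlebot traffic, the limit is shared across
    # every server block below -- i.e., across all of the domains.
    limit_req_zone $googlebot zone=googlebot:1m rate=2r/s;

    server {
        listen 80;
        server_name example.com;   # repeat limit_req in each site's server block
        location / {
            limit_req zone=googlebot burst=5;
            # ... normal site config ...
        }
    }
}
```

Requests over the limit get a 503, which Googlebot generally treats as a signal to back off and retry later.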
Google Webmaster Tools lets you rate-limit Googlebot on a given domain, but I haven't found a way to limit the bot across an entire IP network. Has anyone dealt with this? How did you fix it?
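In the absence of a per-network control in Webmaster Tools, another option is to throttle at the firewall on each server (or on a shared gateway) by source network. A hedged iptables sketch; 66.249.64.0/19 is one commonly observed Googlebot block and 5/sec is a placeholder, so verify both against Google's current published crawler ranges and your own capacity:

```
# Throttle new HTTP connections from the Googlebot range to roughly
# 5 per second in aggregate (omitting --hashlimit-mode gives one
# shared bucket rather than a per-source-IP limit).
iptables -A INPUT -p tcp --dport 80 --syn -s 66.249.64.0/19 \
    -m hashlimit --hashlimit-name googlebot \
    --hashlimit-above 5/sec --hashlimit-burst 10 \
    -j DROP
```

Dropped SYNs just make Googlebot retry more slowly, but sustained drops can also be counted as crawl errors, so this is a blunter instrument than serving a proper 503.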