Normally, I set up robots.txt to allow Google to crawl these sites, but unless I can think of a way to tell it to back off to once a week or so, I'm about to Disallow it. They're just little private sites, and Google ends up being 99% of the traffic. It's silly.
You can set the crawl rate on your site a couple of ways. I think (but am not sure) that Google supports the Crawl-delay robots.txt directive. You can also set the Googlebot crawl rate for 90 days in Webmaster Tools.
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=4862...
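For crawlers that do honor it, Crawl-delay is just one extra line in robots.txt. A rough sketch (the 10-second value is only an example; the directive sets a minimum wait between requests, and support varies by crawler):

    # Ask compliant crawlers to wait at least 10 seconds between requests
    User-agent: *
    Crawl-delay: 10

That only slows the rate down rather than scheduling a once-a-week visit, so the Webmaster Tools setting may be the better bet if it's specifically Googlebot you want to throttle.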
BTW, I have not experienced this problem myself.
Cheers, Chris Dean