It is interesting to note that you could be the unfortunate recipient of a DoS attack from Google/Bing/Yahoo simply because someone creates a page that overloads your database and brings your server to its knees. The real lesson here is that you should NEVER execute any SQL based on user-supplied data without vetting it first. The secondary lesson is that you should (if you can) limit requests to your server from search engines via your robots.txt file (from here):
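To make the first lesson concrete, here is a minimal sketch of vetting user-supplied data: pass values to the database as bound parameters rather than splicing them into the SQL string. The table and slug names are hypothetical, and `sqlite3` stands in for whatever driver you actually use; most drivers expose the same parameter-binding idea.

```python
import sqlite3

# Hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, slug TEXT)")
conn.execute("INSERT INTO pages (slug) VALUES (?)", ("about-us",))

# Hostile input a crawler might happily follow from a crafted link.
user_input = "about-us' OR '1'='1"

# Unsafe (don't do this): "SELECT ... WHERE slug = '%s'" % user_input
# Safe: the driver treats the bound value purely as data, never as SQL.
rows = conn.execute(
    "SELECT id, slug FROM pages WHERE slug = ?", (user_input,)
).fetchall()
print(rows)  # the injection attempt matches nothing
```

With the parameterized form, the quote characters in the input can never terminate the string literal and change the query's meaning.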
If your web application has trouble handling even occasional requests (for example, one request per second), you can slow down Bing and Yahoo with the following entry in robots.txt:
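The original snippet is not reproduced here; a sketch of such an entry, using the non-standard `Crawl-delay` directive that Bing and Yahoo honored and the 120-second figure the text mentions, would look like this:

```
User-agent: *
Crawl-delay: 120
```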
This asks crawlers to wait at least 120 seconds between requests. Googlebot ignores `Crawl-delay`; for Google, you define the crawl rate in Webmaster Tools instead.