Checked into this a bit at FutureQuest. There's not a whole hell of a lot I can do about this problem. The software controls apply to all accounts on FutureQuest, and they're not going to change them. They're designed to keep spidering (ro-)bots from overwhelming the servers. These spidering programs are sent out by companies like Google and Microsoft to compile an index of what's out there on the web, for use in their search engines. The bots are apparently poorly written when it comes to pacing their requests: they can overwhelm a server with requests, basically shutting down a site and, in my case, over-running my site's bandwidth allowance.
A few months ago, the MSbot used 3x my site's bandwidth allowance, leaving me with an extra $100 to pay to FQ so that the crappy MS-bot could run amok on Gunks.com. It took me a while, but I finally found a human at MS to email, asking them to please stop. I eventually just blocked the MSbot from the site, so a gunks.com search on MS probably yields a lot less than on Google.
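For anyone curious, blocking a crawler like this is just a couple of lines in a robots.txt file at the site root. A minimal sketch (assuming the bot still identifies itself as `msnbot`, the user-agent name Microsoft's crawler used at the time; well-behaved bots honor this, but it's voluntary):

```
# Tell Microsoft's crawler to stay out of the whole site
User-agent: msnbot
Disallow: /

# Everyone else can crawl, but please pace requests
# (Crawl-delay is a non-standard directive; only some bots honor it)
User-agent: *
Crawl-delay: 2
```

This only works against bots that actually read and respect robots.txt; a misbehaving crawler has to be blocked at the server level instead.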
But let me ask you all: what is the big inconvenience of having to wait 2.0 seconds to make your next request? Is this really a problem? Does the delay ever get beyond 2 seconds?