I'm sure we've all seen our servers get scanned repeatedly for
vulnerabilities in other systems.  A quick check through the error logs
shows some obvious examples of this, including requests for:

    /_vti_bin
    /scripts
    /MSADC
    /MSOFFICE

Etc, etc.  Almost inevitably, these requests come in bursts, typically
from the same IP.

All of these calls are currently getting the customary 404, but I wonder
if there's anything more intelligent or proactive to be done.  I've
thought about building myself a hosts-deny kind of solution using
external methods, but I'm not sure that's necessarily going to save me
very many cycles in the long run.

Has anybody thought of a better way to handle this kind of stuff?

TIA,

Dylan
Dylan> I'm sure we've all seen our servers get scanned repeatedly for
Dylan> vulnerabilities in other systems....

Dylan> All of these calls are currently getting the customary 404, but I
Dylan> wonder if there's anything more intelligent or proactive to be
Dylan> done.

You might be able to slow them down.  Depending what sort of control you
have over the HTTP bits stuffed on the wire, when you encounter requests
for such pages, you can have the thread serving the connection slow its
responses to a crawl, issue "100 Continue" responses, etc.  In the mail
spam world this is generally called "teergrubing".

The challenge of identifying suspect clients is easier with HTTP than
with SMTP.  HTTP clients have to ask for a page you recognize as clearly
a scan for holes in your system.  In SMTP you have to infer the other
end is a bad guy based upon the remote IP address.

It should be fairly easy to extend the httplib module to crawl when
asked.

-- 
Skip Montanaro - skip@pobox.com
http://www.mojam.com/
http://www.musi-cal.com/
Skip> It should be fairly easy to extend the httplib module to crawl
Skip> when asked.

Sorry for the braino.  That should have been SimpleHTTPServer or a
similar module, not httplib.

Skip
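The tarpit idea Skip describes can be sketched with Python 3's http.server, the successor to SimpleHTTPServer.  This is only an illustration, not anything from the thread's actual code: the names TarpitHandler, SCAN_PREFIXES, and TARPIT_DELAY are made up, and the prefixes are simply the scan paths from the original post.

```python
import time
from http.server import BaseHTTPRequestHandler

# Scan signatures taken from the error logs in the original post.
SCAN_PREFIXES = ("/_vti_bin", "/scripts", "/MSADC", "/MSOFFICE")
TARPIT_DELAY = 30  # seconds to stall an obvious scanner (illustrative)

def is_scan_path(path):
    """True if the request path matches a known scan signature."""
    return path.startswith(SCAN_PREFIXES)

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if is_scan_path(self.path):
            # Slow the response to a crawl before answering.
            time.sleep(TARPIT_DELAY)
        # Everyone ultimately gets the customary 404.
        self.send_error(404)
```

To actually serve, one would run something like `ThreadingHTTPServer(("", 8000), TarpitHandler).serve_forever()`; with a threading server each connection gets its own thread, so a stalled scanner does not block legitimate requests.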
Skip Montanaro wrote:
You might be able to slow them down. Depending what sort of control you have over the HTTP bits stuffed on the wire, when you encounter requests for such pages, you can have the thread serving the connection slow its responses to a crawl, issue "100 Continue" responses, etc.
Isn't this the same as a DOS attack on your own server, though?

cheers,

Chris
>> You might be able to slow them down.  Depending what sort of control
>> you have over the HTTP bits stuffed on the wire, when you encounter
>> requests for such pages, you can have the thread serving the
>> connection slow its responses to a crawl, issue "100 Continue"
>> responses, etc.

Chris> Isn't this the same as a DOS attack on your own server, though?

Not if you have a multi-threaded server.  Legitimate requests will be
handled by new threads.  Legitimate 404's (stuff not on your list of
"obvious scan attempts") will get 404'd immediately.  You might pile up
a few sleepy threads, but all-in-all the load on your server should be
quite modest.

The only problem you might encounter would be if your server got blasted
by large numbers of such requests in a very short period of time.  To
avoid this problem you could cap the number of "sluggish" responses at
some figure, after which you simply fall back to regular 404 responses.

Skip
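The cap Skip suggests can be kept with a simple counter guarded by a lock, so that once too many threads are sleeping, further scan requests get an immediate 404.  A minimal sketch; the names MAX_SLUGGISH, should_tarpit, and release_tarpit are illustrative, not from any library:

```python
import threading

MAX_SLUGGISH = 20  # illustrative cap on simultaneously stalled responses

_lock = threading.Lock()
_sluggish = 0

def should_tarpit():
    """Atomically claim a tarpit slot; False means answer 404 at once."""
    global _sluggish
    with _lock:
        if _sluggish >= MAX_SLUGGISH:
            return False
        _sluggish += 1
        return True

def release_tarpit():
    """Free the slot once the stalled response finally completes."""
    global _sluggish
    with _lock:
        _sluggish -= 1
```

A request handler would call `should_tarpit()` before sleeping and `release_tarpit()` in a `finally:` clause afterwards, so a slot is never leaked if the connection drops mid-snooze.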
Skip Montanaro wrote:
>> You might be able to slow them down.  Depending what sort of control
>> you have over the HTTP bits stuffed on the wire, when you encounter
>> requests for such pages, you can have the thread serving the
>> connection slow its responses to a crawl, issue "100 Continue"
>> responses, etc.
Chris> Isn't this the same as a DOS attack on your own server, though?
To avoid this problem you could cap the number of "sluggish" responses at some figure, after which you simply fall back to regular 404 responses.
All sounds cool, lot of work though ;-)

cheers,

Chris
>> To avoid this problem you could cap the number of "sluggish"
>> responses at some figure, after which you simply fall back to
>> regular 404 responses.

Chris> All sounds cool, lot of work though ;-)

Not really, though of course it depends on how motivated you are to
solve the problem. ;-)

You need a 404 handler which checks to see if the start of the requested
path is on the no-no list.  When the handler is called, it first checks
the number of running threads.  If the max has been reached or exceeded,
shoot back a 404 and return.  Otherwise, increment the running threads
counter, snooze for awhile, then redirect to the next path in the chain.

If you're fronting Zope with Apache or Squid I suspect it would be worth
checking to see if they already implement something similar.  You could
easily do something with mod_rewrite, though I'm not too sure about the
thread counter business.  You'd probably just bump up against the
maximum number of httpd processes (in which case you _would_ have a DOS
attack).

Skip
participants (3)
- Chris Withers
- Dylan Reinhardt
- Skip Montanaro