What is the best way to prevent and block the scans? (w00tw00t, tmUnblock.cgi etc)

September 25, 2014

Hi, I often see lines like these in /var/log/apache2/access.log:

- - [25/Sep/2014:12:25:46 +0200] "POST /%70%68%70%70%61%74%68/%70%68%70?%2D%64+%61%6C%6C%6F%77%5F%75%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E+%2D%64+%73%61%66%65%5F%6D%6F%64%65%3D%6F%66%66+%2D%64+%73%75%68%6F%73%69%6E%2E%73%69%6D%75%6C%61%74%69%6F%6E%3D%6F%6E+%2D%64+%64%69%73%61%62%6C%65%5F%66%75%6E%63%74%69%6F%6E%73%3D%22%22+%2D%64+%6F%70%65%6E%5F%62%61%73%65%64%69%72%3D%6E%6F%6E%65+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64%5F%66%69%6C%65%3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%6E HTTP/1.1" 404 393 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +"
- - [25/Sep/2014:12:30:57 +0200] "GET / HTTP/1.1" 404 431 "-" "ZmEu"
- - [25/Sep/2014:12:30:58 +0200] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 421 "-" "ZmEu"
- - [25/Sep/2014:12:30:59 +0200] "GET /phpmyadmin/scripts/setup.php HTTP/1.1" 401 580 "-" "ZmEu"
- - [25/Sep/2014:12:31:00 +0200] "GET /pma/scripts/setup.php HTTP/1.1" 404 415 "-" "ZmEu"
- - [25/Sep/2014:12:31:00 +0200] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 419 "-" "ZmEu"
- - [25/Sep/2014:12:31:05 +0200] "GET /MyAdmin/scripts/setup.php HTTP/1.1" 404 419 "-" "ZmEu"
- - [25/Sep/2014:14:53:55 +0200] "GET /tmUnblock.cgi HTTP/1.1" 400 431 "-" "-"
- - [25/Sep/2014:15:23:54 +0200] "GET /cgi-sys/defaultwebpage.cgi HTTP/1.0" 404 427 "-" "() { :;}; /bin/ping -c 1"

What is the best way to prevent this and block the scans?

Thank you very much in advance

3 Answers

I'd suggest fail2ban. It's a service you can configure to monitor your log files for attacks and automatically ban offending IPs after a certain threshold. There are a number of packaged "jails" for Apache that you can turn on right away for things like too many failed authorizations, bad bot user-agent strings, and attempts to access scripts.
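For example, on Debian/Ubuntu the fail2ban package ships with predefined Apache jails (the exact jail names can vary by version, so check the jail.conf on your system), which you can enable in a jail.local override rather than editing jail.conf directly:

```ini
# /etc/fail2ban/jail.local -- enable the stock Apache jails
[apache-auth]
enabled = true

# Bans clients with known-bad bot user agents
[apache-badbots]
enabled = true

# Bans clients probing for scripts that don't exist
[apache-noscript]
enabled = true
```

Restart fail2ban after editing, and it will start watching the Apache logs with those filters.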

I have a lot of sites that are just hobby level, and I don't worry much about security on those. I do work for a company, though, that has critical information. For that site, we have a 404 handler that logs all requests and emails them to me. Yes, I get bombarded by 404 emails on a daily basis, but keeping the site secure is my job, among other things. One thing you could do is auto-block IPs after too many 404s. There are utilities that do this already, but the script is not really difficult to write yourself.
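As a rough sketch of the 404-threshold idea (log path and threshold are illustrative, and the log is assumed to be in Apache's common/combined format, with the client IP in field 1 and the status code in field 9):

```shell
# Count 404 responses per client IP and print offenders that pass a
# threshold; those are candidates for an automatic block.
LOG=/var/log/apache2/access.log
THRESHOLD=20

if [ -r "$LOG" ]; then
    awk '$9 == 404 { count[$1]++ }
         END { for (ip in count) if (count[ip] >= t) print ip, count[ip] }' \
        t="$THRESHOLD" "$LOG"
fi
```

You could run this from cron and feed the resulting IPs to iptables or a deny list.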

For my personal web server (nginx), since it's not a public website, I used a GeoIP module to block all requests coming from China (using the IP range database provided by MaxMind). Later I decided to GeoIP-block all incoming traffic from specific countries at the firewall level instead, regardless of the protocol (SSH, HTTP, ...). Both methods helped a lot in getting rid of the listed HTTP scans.
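The firewall-level variant can be done with ipset, which handles large address lists far more efficiently than one iptables rule per range. This is only a sketch: it assumes you have a plain-text list of CIDR blocks for the country (one per line, exported from your GeoIP provider) saved as cn.zone, a filename chosen here for illustration.

```
# Create a set of networks (idempotent thanks to -exist)
ipset create geoblock hash:net -exist

# Load every CIDR block from the zone file into the set
while read -r net; do
    ipset add geoblock "$net" -exist
done < cn.zone

# Drop all traffic from those ranges, regardless of protocol
iptables -I INPUT -m set --match-set geoblock src -j DROP
```

Note that ipset contents don't survive a reboot unless you save and restore them (e.g. with ipset save/restore).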

If blocking whole countries isn't desirable for some reason, try to find a way to auto-generate iptables DROP rules for the offending client IPs whenever such a scan takes place:


  • Read /var/log/apache2/access.log periodically and search for lines containing "/phpmyadmin/scripts/setup.php".
  • Extract the potential IP(s), if found, and write them to a file.
  • Check whether the found IP(s) are already banned (e.g. through iptables).
  • Block incoming requests from the found IP(s): /sbin/iptables -I INPUT -s {IP-HERE} -j DROP
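The steps above could be sketched as a small shell script run from cron. This is only an illustration: the ban-list location is made up, and the log is assumed to be in Apache's combined format (client IP in field 1).

```shell
#!/bin/sh
LOG=/var/log/apache2/access.log
BANLIST=/var/lib/scanblock/banned_ips.txt   # illustrative path

if [ -r "$LOG" ]; then
    mkdir -p "$(dirname "$BANLIST")"
    touch "$BANLIST"

    # 1. Find requests for the probed script and extract unique client IPs.
    grep -F "/phpmyadmin/scripts/setup.php" "$LOG" \
        | awk '{ print $1 }' | sort -u \
        | while read -r ip; do
            # 2. Skip IPs that are already on the ban list.
            grep -qxF "$ip" "$BANLIST" && continue
            # 3. Record the new offender and drop its traffic.
            echo "$ip" >> "$BANLIST"
            /sbin/iptables -I INPUT -s "$ip" -j DROP
        done
fi
```

This is essentially what fail2ban automates for you, so for anything beyond a quick experiment the packaged tool is probably the better choice.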

Maybe it's not the most efficient way, but over time you will build up your own "bad bot" IP database; so far I haven't found any ready-to-use IP range databases of known worldwide bad bots.

Hope it helps.
