Hello,
Someone is scanning my site for files and folders. For example:

https://site.com/abc.php
https://site.com/def.php

How can I block these kinds of bots from my website? If a client triggers more than five 404 errors, I want to block it automatically.

I am using CentOS 7. Thanks


1 answer

Hi @dtorgul,

This is something you’ll need to handle at the web server level (Apache/Nginx) rather than at the OS level. If you are using Apache, here are two ways to block bots trying to access your site: one is through the robots.txt file, and the other is through the .htaccess file.
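For the robots.txt route, a minimal sketch looks like the following (the bot name is a placeholder; real crawlers publish their user-agent names). Keep in mind that robots.txt is purely advisory: well-behaved crawlers honor it, but the kind of scanner probing for abc.php will simply ignore it.

```
# /robots.txt - advisory only; malicious scanners typically ignore this file
# "BadBot" is a placeholder; substitute the crawler's published user-agent name
User-agent: BadBot
Disallow: /
```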

First of all, a word of warning. Be very careful when you’re blocking bots through the .htaccess file. One typo and you can end up blocking the entire Internet. Obviously you don’t want that.

The first thing you want to do is back up your current .htaccess file. If a mistake blocks traffic you didn’t intend to block, you can restore the backup to revert the changes while you figure out what went wrong.
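Backing it up is a one-liner. The path below assumes the default CentOS document root of /var/www/html, so adjust it to wherever your site actually lives:

```
# Copy the live .htaccess to a backup before making any changes
cp /var/www/html/.htaccess /var/www/html/.htaccess.bak
```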

The second thing you want to do is locate your access logs. On CentOS 7 they are typically in /var/log/httpd (the /var/log/apache2 path you may see in guides is the Debian/Ubuntu convention). If they aren’t there, check /etc/httpd/logs, which is usually a symlink to the same place.
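Once you’ve found the log, you can see exactly who is generating those 404s. As a rough sketch, assuming the default combined log format (where the status code is the ninth whitespace-separated field and the client IP is the first), this lists the IPs with the most 404 responses:

```
# Count 404 responses per client IP, most frequent first
awk '$9 == 404 {print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head
```

Any IP near the top of that list with dozens of 404s in a short window is almost certainly a scanner, and is a good candidate for your "more than five 404s" rule.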

The log file records every request, from your regular users and bots alike. Some bots, like Google’s crawlers, identify themselves through their user-agent string. Bad bots sometimes identify themselves too, but more often they just have certain characteristics that flag them as non-human.
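To see which user agents are hitting your site most, you can pull the user-agent field out of the same log. Again this assumes the combined log format, where the user agent is the sixth field when splitting on double quotes:

```
# List user agents seen in the log, most frequent first
awk -F'"' '{print $6}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head
```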

Once you’ve identified the offending user agents (or IPs), you can block them in your .htaccess, as sketched below.
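Here is a minimal sketch of what that looks like. The bot names and IP are placeholders, not a vetted blocklist, and this assumes mod_rewrite is enabled and your vhost’s AllowOverride setting permits these directives in .htaccess:

```
# Reject requests whose User-Agent matches bad patterns (placeholder names)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
RewriteRule .* - [F,L]

# Alternatively, deny specific offending IPs found in the log (Apache 2.4 syntax)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
</RequireAll>
```

Test with a handful of entries first and watch the log to confirm you’re only blocking the traffic you intended to.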

Regards,
KDSys
