This is something you’ll need to handle at the web server level (Apache or Nginx) rather than at the OS level. If you are using Apache, there are two ways to block bots trying to access your site: one is through the robots.txt file, and the other is through the .htaccess file.
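For the robots.txt route, a minimal example looks like the following. Keep in mind that robots.txt is purely advisory: only well-behaved crawlers honor it, so it won’t stop abusive bots (the bot name here is a placeholder).

```
# robots.txt — placed in the site's document root; only cooperative bots obey it
User-agent: BadBot
Disallow: /
```

For bots that ignore robots.txt, the .htaccess approach below is the one that actually enforces the block.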
First of all, a word of warning. Be very careful when you’re blocking bots through the .htaccess file. One typo and you can end up blocking the entire Internet. Obviously you don’t want that.
The first thing you want to do is back up your current .htaccess file. If an error ends up blocking traffic you didn’t intend to block, you can restore the old file to revert the changes while you figure out what went wrong.
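A quick sketch of the backup step. The path here is a scratch directory for illustration; on a real server the .htaccess file lives in your site’s document root.

```shell
# Demo in a scratch directory — substitute your real document root
mkdir -p /tmp/site-demo
printf '# existing rules\n' > /tmp/site-demo/.htaccess

# Take the backup before editing anything
cp /tmp/site-demo/.htaccess /tmp/site-demo/.htaccess.bak

# If an edit goes wrong, revert with:
#   cp /tmp/site-demo/.htaccess.bak /tmp/site-demo/.htaccess
```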
The second thing you want to do is find your access logs. In your case, they should be in the folder /var/log/apache2. If they aren’t there, check the CustomLog directive in your Apache configuration to see where logs are being written.
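If the logs aren’t in the default location, one way to find them is to search the Apache configuration for the CustomLog directive, which names the log file. A sketch (the sample config file below stands in for your real /etc/apache2/ tree):

```shell
# Create a sample config file for illustration
printf 'ServerName example.com\nCustomLog ${APACHE_LOG_DIR}/access.log combined\n' > /tmp/apache-sample.conf

# The CustomLog line shows where access logs are written
grep -i 'customlog' /tmp/apache-sample.conf

# On a real server you would run something like:
#   grep -Ri customlog /etc/apache2/
```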
The log file will contain entries for all of your regular users as well as all bot access. Some bots, like Google’s crawlers, identify themselves through their user-agent string. Bad bots sometimes identify themselves too, but often they just have certain characteristics that flag them as non-human.
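To see which user agents are hitting your site most often, you can pull the user-agent field out of the combined-format log. A sketch, with a couple of sample log lines standing in for /var/log/apache2/access.log (the bot names are invented):

```shell
# Sample access log in Apache "combined" format (user agent is the last quoted field)
cat > /tmp/access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
5.6.7.8 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 123 "-" "BadBot/1.0"
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /admin HTTP/1.1" 404 0 "-" "BadBot/1.0"
EOF

# Split each line on double quotes: field 6 is the user agent.
# Count occurrences and list the most frequent agents first.
awk -F'"' '{print $6}' /tmp/access.log | sort | uniq -c | sort -rn
```

On a real server, point the awk command at /var/log/apache2/access.log instead; agents with suspiciously high request counts are the candidates to block.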
From there you can take the user agents of the offending bots and block them in your .htaccess file.
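A sketch of one common way to do this with mod_rewrite, returning 403 Forbidden to matching user agents. "BadBot" and "EvilScraper" are placeholders — substitute the strings you found in your logs:

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Match the user-agent strings found in the access log, case-insensitively
  RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
  # Serve 403 Forbidden and stop processing further rules
  RewriteRule .* - [F,L]
</IfModule>
```

Test a change like this against your own browser and a known-good crawler before leaving it in place; an overly broad pattern (for example one that matches "Mozilla") will lock out ordinary visitors.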