How to prevent my CPU from hitting 100%

I have a droplet running Ubuntu to support a small Rails app. A few weeks ago, my handful of users reported that they couldn't get on, and I discovered that a bot was indexing every page on my site, hitting several pages per second and using up all of my CPU capacity. This now seems to happen about once per week. I am not a highly skilled webmaster - somebody set up the droplet for me, and I just know how to write the Rails code and work with the database. Is there anything I can do on the server to, say, limit how many pages one user can hit at a time, or anything else that would prevent this from happening?


Site Moderator
September 9, 2022
Accepted Answer

Hi @gebelo,

What you can do is create a robots.txt file in your website’s root directory that disallows all bots except Googlebot from crawling your website.

The contents of the robots.txt file can be:

```
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```
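Keep in mind that robots.txt is purely advisory: well-behaved crawlers honor it, but abusive bots often ignore it. If your droplet serves the Rails app through Nginx (a common setup, but an assumption here), you can also enforce a per-IP request-rate limit at the web-server level. A minimal sketch (the zone name `perip` and the 5 requests/second rate are placeholders to tune for your traffic):

```nginx
# In /etc/nginx/nginx.conf, inside the http { } block:
# allow each client IP an average of 5 requests/second, tracked in a 10 MB zone
limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

server {
    location / {
        # allow short bursts of up to 10 extra requests, reject the rest
        limit_req zone=perip burst=10 nodelay;
        # rejected requests get 429 Too Many Requests instead of eating CPU
        limit_req_status 429;
        # ... your existing proxy/app configuration ...
    }
}
```

After editing, validate and reload with `sudo nginx -t && sudo systemctl reload nginx`.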
Site Moderator
September 12, 2022

Hello @gebelo

If the bot’s crawling is causing the CPU spikes, then KFS’s robots.txt suggestion above should do the trick for you.

I would also recommend keeping an eye on the droplet’s resource usage with top and on its active network connections with netstat.
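Beyond watching top, your web server’s access log can tell you which client is hammering the site. A sketch, assuming an Nginx/Apache-style log where the client IP is the first field - the sample file below is only for illustration; point the awk pipeline at your real log (e.g. /var/log/nginx/access.log):

```shell
# Create a tiny sample access log for illustration only.
printf '%s\n' \
  '203.0.113.9 - - [09/Sep/2022] "GET /posts/1 HTTP/1.1" 200' \
  '203.0.113.9 - - [09/Sep/2022] "GET /posts/2 HTTP/1.1" 200' \
  '198.51.100.7 - - [09/Sep/2022] "GET / HTTP/1.1" 200' \
  > /tmp/access.log.sample

# Top requesters: count hits per client IP, highest first.
awk '{print $1}' /tmp/access.log.sample | sort | uniq -c | sort -rn | head -5
```

An IP with thousands of hits in a short window is a good candidate for blocking (for example with `ufw deny from <ip>`) or rate limiting.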