How to manage 2000 simultaneous users? MySQL bottlenecks my site.

I run a website that manages user signups for specific dates. When signups open, somewhere near 2000 users hit the site at once, and it has previously crashed. The code is simple and easy, but with so many people hitting the server and reading from and writing to the database at once, it creates a bottleneck and crashes my site. Does anyone have suggestions on how to resolve this? It only happens around 3 times a month, so I would only need the boost a few days out of a given month. Thanks for the input!

Also, I want to mention that my Apache and MySQL are on the same server. Not sure if this also contributes to the crash.



Looking at the defaults for a stock MySQL installation, max_connections is set to 151, so if at any time there are more than 151 active connections to MySQL, you'll receive a "Too many connections" error.
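You can confirm both the configured limit and the peak you've actually hit with two standard queries from the mysql client:

```sql
-- The configured cap (151 on a stock install)
SHOW VARIABLES LIKE 'max_connections';
-- High-water mark of simultaneous connections since the last restart
SHOW STATUS LIKE 'Max_used_connections';
```

If Max_used_connections is at or near max_connections after a signup rush, you've found your ceiling.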

Enabling the query cache will help with SELECT statements (i.e. reads), as it caches the statement as well as the results. For example, if you had a broad SELECT statement such as:

SELECT * FROM my_table

and that statement normally took 2.0 seconds to execute, then with the query cache enabled it may take only a fraction of that the next time around.
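If you want to experiment with it, the cache can be inspected and sized at runtime. One caveat worth knowing: the query cache was deprecated in MySQL 5.7.20 and removed entirely in 8.0, so this only applies to older releases, and the 64 MB size below is just an illustrative assumption:

```sql
-- See whether the cache is compiled in and enabled
SHOW VARIABLES LIKE 'query_cache%';
-- Allocate 64 MB to it (illustrative size, not a recommendation)
SET GLOBAL query_cache_size = 64 * 1024 * 1024;
-- After some traffic, check the hit statistics
SHOW STATUS LIKE 'Qcache%';
```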

That being said, enabling the query cache most likely won't resolve your issue on its own. It will speed up frequent reads to your table(s) when someone checks if a date is available, but it won't do anything for the writes when users are actually adding data to your database.

Back to max_connections, you can raise this limit by modifying the MySQL configuration, though as a general note, the stock MySQL configuration isn’t really meant for production use – it’s meant to be a starting point for you to work with.
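As a sketch, raising the limit looks like this in my.cnf (the value here is an assumption; pick one your RAM can support):

```ini
[mysqld]
# Raise the cap from the stock 151; size this against available RAM
max_connections = 300
```

max_connections is also a dynamic variable, so `SET GLOBAL max_connections = 300;` applies it without a restart, though it won't survive one unless it's also in the config file.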

The more connections you allow, the more RAM MySQL is going to consume, so if you raise it too high you may very well run into MySQL crashing due to out-of-memory (OOM) errors instead of maxing out the connection limit.
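The worst-case arithmetic is roughly "global buffers plus every allowed connection allocating its full set of per-thread buffers." Here's a minimal sketch of that estimate; the function name and the buffer sizes below are assumptions for illustration, not measurements of your server:

```python
def mysql_max_memory(global_buffers_mb, per_connection_kb, max_connections):
    """Rough worst-case MySQL memory use in MB: global buffers plus
    every allowed connection using its full per-thread buffers."""
    return global_buffers_mb + (per_connection_kb * max_connections) / 1024

# Assumed per-thread buffers (sort, read, join, thread stack), in KB
per_conn_kb = 256 + 128 + 128 + 256  # = 768 KB per connection

print(mysql_max_memory(128, per_conn_kb, 151))  # stock limit -> 241.25 MB
print(mysql_max_memory(128, per_conn_kb, 500))  # raised limit -> 503.0 MB
```

The point of the sketch: raising max_connections from 151 to 500 roughly doubles the worst-case footprint here, which is exactly the kind of jump that gets a small Droplet OOM-killed.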

It’s a balancing act of sorts, and there are a lot of variables at play – some you need to pay more attention to than others, while some you should leave alone unless you have reason to modify them.

With that, what size is your current Droplet (i.e. RAM)?

I’d also recommend looking at MySQL Tuner. It’ll help you get a general idea of what’s going on and it’ll also make a few suggestions as to what you may need to tweak.

To install MySQL Tuner, choose a directory such as /opt or perhaps /usr/local/src and run (the URLs below follow the MySQLTuner-perl project's README):

wget http://mysqltuner.pl/ -O mysqltuner.pl \
&& wget https://raw.githubusercontent.com/major/MySQLTuner-perl/master/basic_passwords.txt -O basic_passwords.txt \
&& wget https://raw.githubusercontent.com/major/MySQLTuner-perl/master/vulnerabilities.csv -O vulnerabilities.csv \
&& perl mysqltuner.pl

Here’s the project: major/MySQLTuner-perl on GitHub.

Before running it, MySQL should ideally have been up for at least 24-48 hours so the script has some data to work with.

Having 2000 users on your site does not mean 2000 simultaneous connections to MySQL; the actual number should be much lower, and MySQL has enough capability to handle such a load. How exactly does your site crash? Is it the server software crashing, or MySQL? Can you give the exact error message so we can be more specific?

If the problem is indeed on the MySQL side, take a look at two things. MySQL comes with several example config files for different levels of load; make sure you take settings from the one designed for high load. Also, turn on the query cache, as it may help increase the performance of fetching data from the database.
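For reference, the high-load sample that older MySQL releases ship (my-huge.cnf, typically found under /usr/share/mysql or the source tree's support-files directory) raises settings along these lines; the values below are representative of that template, not a recommendation for your server:

```ini
[mysqld]
key_buffer_size   = 384M
table_open_cache  = 512
sort_buffer_size  = 2M
read_buffer_size  = 2M
thread_cache_size = 8
query_cache_size  = 32M
```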