How to manage 2000 simultaneous users? My MySQL bottlenecks my site.

April 21, 2017 247 views
Apache MySQL PHP PHP Frameworks

I run a website that manages user signups for specific dates. When the signups open, there are somewhere near 2000 users on my site, and that load has previously crashed it. The code is simple, but with so many people hitting the server and sending and receiving data from the database at once, it creates a bottleneck and takes the site down. Does anyone have suggestions on how to resolve this? It only happens around 3 times a month, so I would only need the extra capacity a few days out of a given month. Thanks for the input!

Also, I want to mention that my Apache and MySQL are on the same server. Not sure if this is also contributing to the crash.

2 Answers

Having 2000 users on the site does not mean 2000 simultaneous connections to MySQL; the actual number should be much lower, and MySQL has enough capability to handle that kind of load. How exactly is your site crashing? Is it the server software or MySQL that fails? Can you give the exact error message so we can be more specific?

If the problem is indeed on the MySQL side, take a look at two things. MySQL comes with several example config files for different levels of load; make sure you take the settings from the one designed for high load. Also turn on the query cache, as it may help increase the performance of fetching data from the database.
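As a rough sketch, you can check the current query-cache settings from the MySQL prompt. The 64 MB size below is only an illustrative value, and on some MySQL versions query_cache_type can only be enabled in my.cnf followed by a restart:

-- See whether the query cache is compiled in and what it is currently set to
SHOW VARIABLES LIKE 'query_cache%';

-- Resize the cache at runtime; 64 MB is only an example, tune it for your workload
SET GLOBAL query_cache_size = 64 * 1024 * 1024;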

  • Thanks for the quick response! The first error I got was a 503, and then on reloading I would also get max connection limit errors as well. I don't feel like it's a code issue because the code is pretty simple: all it does is query the available dates, check if the user is signed up for the date, count the users signed up, and add dates to the user's profile. I'll try to turn on the query cache. Thanks!

@derekleinbach

From looking at the defaults for a stock MySQL installation, max_connections is set to 151, so if at any time, there are more than 151 active connections to MySQL, you'll receive that error.
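If you want to confirm this, the following statements (standard MySQL commands) show the configured limit, the connections open right now, and the peak since the last restart:

-- Configured connection limit
SHOW VARIABLES LIKE 'max_connections';

-- Connections open right now, and the high-water mark since the last restart
SHOW STATUS LIKE 'Threads_connected';
SHOW STATUS LIKE 'Max_used_connections';

If Max_used_connections is sitting at or near 151 after one of your signup rushes, that matches the max connection errors you're seeing.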

Enabling the query cache will help with SELECT statements (i.e. reads), as it caches the statement as well as the results. For example, if you had a broad SELECT statement such as:

select * from my_table

and that statement normally took 2.0 seconds to execute, then with the query cache enabled it may take only a fraction of that, or effectively 0.0 seconds, the next time around.

That being said, enabling the query cache most likely won't resolve your issue on its own. It will speed up frequent reads to your table(s) when someone checks whether a date is available, but it won't do anything when users are actually adding data to your database.
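If you do enable it, you can get a rough idea of whether it's actually helping by comparing the cache counters against the number of SELECTs served (all of these are standard MySQL status variables):

-- Qcache_hits vs. Com_select gives a rough hit rate;
-- a climbing Qcache_lowmem_prunes means the cache is too small
SHOW STATUS LIKE 'Qcache%';
SHOW STATUS LIKE 'Com_select';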

...

Back to max_connections, you can raise this limit by modifying the MySQL configuration, though as a general note, the stock MySQL configuration isn't really meant for production use -- it's meant to be a starting point for you to work with.
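As a rough sketch (the value 300 below is only a placeholder, not a recommendation), you can raise the limit on the running server and then verify it:

-- Raise the limit at runtime (requires the SUPER privilege)
SET GLOBAL max_connections = 300;

-- Confirm the change took effect
SHOW VARIABLES LIKE 'max_connections';

A runtime change like this is lost on restart, so mirror whatever value you settle on in your my.cnf under [mysqld].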

The more connections you allow, the more RAM MySQL is going to consume, so if you raise it too high you may very well run into MySQL crashing due to OOM errors instead of maxing out the connection limit.

It's a balancing act of sorts and there are a lot of variables at play -- some you need to pay more attention to than others, while others you should just leave alone unless you have a reason to modify them.

...

With that, what size is your current Droplet (i.e. RAM)?

I'd also recommend looking at MySQL Tuner. It'll help you get a general idea of what's going on and it'll also make a few suggestions as to what you may need to tweak.

To install MySQL Tuner, choose a directory such as /opt or perhaps /usr/local/src and run:

wget http://mysqltuner.pl/ -O mysqltuner.pl \
&& wget https://raw.githubusercontent.com/major/MySQLTuner-perl/master/basic_passwords.txt -O basic_passwords.txt \
&& wget https://raw.githubusercontent.com/major/MySQLTuner-perl/master/vulnerabilities.csv -O vulnerabilities.csv \
&& perl mysqltuner.pl

Here's the project: https://github.com/major/MySQLTuner-perl

Before running it, MySQL should ideally have been up for at least 24-48 hours so the script has some data to work with.
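If you're not sure how long the server has been up, you can check from the MySQL prompt first:

-- Uptime is reported in seconds; 86400 seconds = 24 hours
SHOW GLOBAL STATUS LIKE 'Uptime';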
