Load Balancing for Custom PHP App with heavy traffic

March 3, 2018 139 views
Load Balancing Debian

I'm using a DigitalOcean server with 48 GB of RAM and 12 processor cores for my custom PHP web app. Daily pageviews are now up to 500,000, the server can't keep up, and the site loads very slowly; CPU usage is very high due to the heavy traffic. A friend suggested using load balancing for the app, but I'm not familiar with load balancing, so I'm looking for a solution in this discussion.

My app is a web-based Facebook quiz written in PHP, and its main function is to generate images for users. Right now the server can't handle the heavy traffic and is slow to produce the image results.

To set up load balancing for my PHP web app, what are the minimum server requirements?

I'm running Debian, and daily pageviews are around 500,000. I also use Cloudflare's CDN service. Please kindly help me solve my problem. Thanks to all.

1 Answer
moisey MOD March 14, 2018
Accepted Answer

There are two things you should do. Technically, yes, you can load balance: create an image (snapshot) of your existing server, spin up a second Droplet from it, and then create one of our Load Balancers to distribute traffic across both Droplets.

Besides that, you should also look at the output of a process monitor such as `top`.

You want to see which processes are taking up the largest share of resources.
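For a quick snapshot without an interactive monitor, `ps` can list the heaviest processes directly. This is a minimal sketch; the `--sort` syntax below is the GNU procps form shipped with Debian:

```shell
# Ten processes using the most CPU (header line plus ten entries).
ps aux --sort=-%cpu | head -n 11

# Ten processes using the most memory.
ps aux --sort=-%mem | head -n 11
```

If PHP-FPM workers or an image library dominate these lists, that tells you which workload to split out first.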

The reason for this is that you want to start separating different processes out onto dedicated Droplets.

For example, if 20% of your load comes from, say, MySQL, you would first move MySQL onto a dedicated Droplet. That reduces the load on your main app/web server by 20%, and the database now has a Droplet to itself with more breathing room.
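Moving MySQL out might look like the sketch below. The private IPs (`10.0.0.2` for the web Droplet, `10.0.0.3` for the database Droplet), the database name, and the config file path are placeholders, and the path assumes Debian's MySQL packaging (MariaDB uses a different file):

```shell
# On the new database Droplet: have MySQL listen on the private network
# instead of localhost only (edit the bind-address line in the config).
sed -i 's/^bind-address.*/bind-address = 10.0.0.3/' /etc/mysql/mysql.conf.d/mysqld.cnf
systemctl restart mysql

# Create an app user that may connect only from the web Droplet.
mysql -e "CREATE USER 'quizapp'@'10.0.0.2' IDENTIFIED BY 'change-me';
          GRANT ALL PRIVILEGES ON quizdb.* TO 'quizapp'@'10.0.0.2';"
```

Then change the database host in your PHP app's configuration from `localhost` to the database Droplet's private IP.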

You also mentioned that one of your app's functions is generating images. That is likely a process that consumes a lot of CPU, so you may want to move that functionality to a different Droplet and have your app/web server call it on the other server through an API, distributing the load further.
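The hand-off from the web server to the image Droplet could be as simple as an HTTP call. The hostname, endpoint, and payload below are entirely hypothetical, just to illustrate the shape of such an internal API:

```shell
# App/web server side: ask the dedicated image Droplet to render a result
# (host, path, and JSON fields are illustrative placeholders).
curl -s -X POST http://image-worker.internal/render \
     -H 'Content-Type: application/json' \
     -d '{"user_id": 42, "quiz": "example-quiz"}' \
     -o /var/www/cache/result-42.png
```

The web server then only serves the finished file, while the CPU-heavy rendering happens on hardware sized for it.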

Once you have processes broken out and separated, setting up load balancing is easier: instead of one server doing everything, you have specialized servers performing specific functions, and you can add resources to each accordingly.
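With our managed Load Balancer you just add the Droplets in the control panel, but if you'd rather run your own balancer, a minimal nginx round-robin config is one way to sketch it (the two backend IPs are placeholders for your app Droplets):

```nginx
# /etc/nginx/conf.d/quiz-lb.conf -- sketch of a self-managed balancer;
# requests alternate between the two app Droplets (round-robin).
upstream quiz_app {
    server 10.0.0.2:80;
    server 10.0.0.4:80;
}

server {
    listen 80;

    location / {
        proxy_pass http://quiz_app;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Either way, the balancer only pays off once both backends can serve any request, which is why separating out stateful pieces like the database comes first.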

Also, if you have a database, you can't simply load balance it; if you're running out of resources there, what you need instead is to set up master <-> slave replication.
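A minimal MySQL replication sketch follows; server IDs, the IP, credentials, and the binlog coordinates are all placeholders (the real file name and position come from running `SHOW MASTER STATUS;` on the master):

```shell
# On the master, in /etc/mysql/mysql.conf.d/mysqld.cnf, then restart MySQL:
#   server-id = 1
#   log_bin   = /var/log/mysql/mysql-bin.log
#
# On the slave, set server-id = 2 in the same file, restart, then point it
# at the master using the coordinates from SHOW MASTER STATUS:
mysql -e "CHANGE MASTER TO MASTER_HOST='10.0.0.3',
          MASTER_USER='repl', MASTER_PASSWORD='change-me',
          MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=154;
          START SLAVE;"
```

Writes still go to the master, but reads can be spread across the slaves, which is usually where a read-heavy app like a quiz runs out of headroom first.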

Stateless systems are the easiest to load balance directly, while other services have their own built-in mechanisms for load balancing or distributing load that you need to use instead.
