I have two types of web application setup.

Type A

  • Application Server => 2 vCPUs (shared), 4 GB RAM, $20/mo
  • Database Server => 2 vCPUs (dedicated), 8 GB RAM, 25 GB disk, $115/mo

Type B

  • Application Server => 1 vCPU (shared), 2 GB RAM, $10/mo
  • Database Server => 1 vCPU (shared), 1 GB RAM, 10 GB disk, $15/mo

Both types run Ubuntu, PHP, and MySQL.

As you can see, Type A has the higher specs.

First I created Type B, then created Type A from a snapshot of Type B.
I uploaded the same SQL database and the same PHP source code to both.

I ran a heavy process under exactly the same conditions on both types.

I expected Type A to complete the task much faster, but the results were as follows.

Type A: 21 minutes 40 seconds
Type B: 12 minutes 25 seconds

Can anyone explain the reason for this result?

1 answer

Hi there,

Indeed, at first glance this does not really make sense. It could be explained by a difference in server configuration, for example if setup B is better tuned in terms of its PHP and MySQL settings.
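One quick way to rule configuration drift in or out is to export the effective MySQL settings on each server with `mysql -N -e "SHOW VARIABLES" > vars.tsv` and diff the two dumps. A minimal sketch in Python (the variable values below are hypothetical samples, not your actual settings):

```python
def parse_vars(text):
    """Parse tab-separated SHOW VARIABLES output into a dict."""
    settings = {}
    for line in text.splitlines():
        name, _, value = line.partition("\t")
        if name:
            settings[name] = value
    return settings

def diff_vars(a, b):
    """Return {name: (value_on_a, value_on_b)} for settings that differ."""
    return {name: (a.get(name), b.get(name))
            for name in sorted(set(a) | set(b))
            if a.get(name) != b.get(name)}

# Hypothetical dumps: here B has a larger InnoDB buffer pool than A.
vars_a = parse_vars("innodb_buffer_pool_size\t134217728\n"
                    "max_connections\t151\n")
vars_b = parse_vars("innodb_buffer_pool_size\t1073741824\n"
                    "max_connections\t151\n")

for name, (va, vb) in diff_vars(vars_a, vars_b).items():
    print(f"{name}: A={va}  B={vb}")
# prints: innodb_buffer_pool_size: A=134217728  B=1073741824
```

Any setting that shows up in the diff (buffer pool size, query cache, `php.ini` values exported the same way from `php -i`) is a candidate explanation for the gap.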

What I would suggest in this case is the following:

  • Add some logging to your application to verify that the two environments are actually processing the same data. If one system has far less data to process, that would explain why it is faster.

  • Monitor resource utilization on both environments to verify that the resources are actually being used rather than sitting idle.

  • Check the error logs to make sure there are no errors causing the delay.
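For the utilization check, a rough Linux-only sketch: sample `/proc/stat` twice while the heavy job runs and report how busy the CPU was in between. If the figure stays low on Type A during the run, the extra vCPUs are sitting idle and the bottleneck is elsewhere (disk, network, or a single-threaded query).

```python
import time

def cpu_times():
    """Return (busy, total) jiffies from the aggregate 'cpu' line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait are treated as not busy
    return sum(fields) - idle, sum(fields)

def cpu_busy_percent(interval=1.0):
    """Percentage of CPU time spent busy over the sampling interval."""
    busy1, total1 = cpu_times()
    time.sleep(interval)
    busy2, total2 = cpu_times()
    return 100.0 * (busy2 - busy1) / max(total2 - total1, 1)

print(f"CPU busy over the last second: {cpu_busy_percent():.1f}%")
```

Tools like `vmstat`, `iostat -x`, and `SHOW FULL PROCESSLIST` in MySQL give the same picture with less work; the point is to capture the numbers on both servers during the run, not after it.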

Let me know how it goes.