Guys, need some help!
What would you recommend for a high-traffic WordPress website?

I've decided to go with the highest plan on DO. I need to handle 25k real-time traffic (I mean 25k users online on the website at the same time).

I thought of going with this:

nginx + Memcached + Varnish + W3 Total Cache

Any suggestions…
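Here is roughly how I imagine the Varnish layer sitting in front of nginx, assuming Varnish takes port 80 and nginx moves to 8080 (the ports and file path are just placeholders, not a tested config):

    # /etc/varnish/default.vcl
    vcl 4.0;

    # Varnish listens on :80 and forwards cache misses to nginx on :8080
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }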
My suggestion: offload the heavy traffic to a CDN. It will definitely solve some of your problems, and you won't need to worry about traffic spikes (I haven't had good experiences with traffic spikes on droplets).
I can recommend KeyCDN, or you can compare CDNs at CDNCost.
@André Pereira da Silva, can you please share your MySQL query caching configuration? :)
I use nginx + fastcgi_cache + PHP-FPM + XCache, and MySQL query caching.
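I haven't posted my exact my.cnf here, but the query-cache part looks roughly like this. The values are just starting points to tune for your own workload, and note that the query cache only exists up to MySQL 5.7 (it was removed in 8.0):

    [mysqld]
    # Cache SELECT result sets (MySQL 5.6/5.7 only; removed in MySQL 8.0)
    query_cache_type  = 1
    # Total memory reserved for the query cache
    query_cache_size  = 64M
    # Skip caching any result set larger than this
    query_cache_limit = 2M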
Hmm :) thanks a lot mate :) I need to experiment with a few things :)
4 clusters:
1 in AMS2;
1 in SGP1; and
2 in NY2.

Each cluster contains (upgrade as needed):
1 caching load balancer (2 cores, 4GB of RAM)
2 web servers (2 cores, 2GB of RAM each)
1 MySQL server (2 cores, 2GB of RAM)

Use private networking to let the droplets communicate with the other droplets in the cluster.

Decide on a cluster that you want to push edits to, and use rsync to keep the other clusters synced so they stay mirrors of each other (see the sketch below).

For DNS, use something like Dyn or Route53 to route traffic to the closest cluster and enable failover if a cluster goes offline.
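For the syncing, something along these lines run from the primary cluster does the job; the user, host, and path are placeholders for your own droplets:

    # Push the web root from the primary cluster to a mirror over SSH,
    # deleting files on the mirror that no longer exist on the primary.
    rsync -az --delete /var/www/html/ deploy@ams2-web1.example.com:/var/www/html/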
@dre

Bro, any recommended tutorial for that installation, please…

And have you changed any settings/config in nginx, or is it all defaults?
I use nginx + fastcgi_cache + PHP-FPM with Zend OPcache + Memcached + W3 Total Cache (only for object cache and database cache).
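The fastcgi_cache side of that looks roughly like the snippet below. The cache path, zone name, PHP-FPM socket, and TTLs are placeholders to adapt to your own setup:

    # Goes in the http context (e.g. an include under /etc/nginx/conf.d/)
    fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=WORDPRESS:100m inactive=60m;
    fastcgi_cache_key "$scheme$request_method$host$request_uri";

    server {
        listen 80;
        server_name example.com;
        root /var/www/html;
        index index.php;

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_pass unix:/run/php/php-fpm.sock;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

            # Serve cached responses from the WORDPRESS zone
            fastcgi_cache WORDPRESS;
            fastcgi_cache_valid 200 301 302 60m;
            fastcgi_cache_use_stale error timeout updating;
        }
    }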
@to.mars.and.back.on.a.turtle,

Thanks for the quick reply mate :)

Not familiar with these things :(

Any detailed tutorial would be much appreciated… :(
Spread the load over more droplets.

For nginx:

    location / {
        try_files $uri $uri/ /index.php?$args;
        expires max;
    }
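On the load balancer side, a plain nginx upstream block can spread requests across the web servers over the private network; the IPs below are placeholders for your droplets' private addresses:

    # Round-robin across the web servers in the cluster
    upstream wordpress_backend {
        server 10.132.0.11;
        server 10.132.0.12;
    }

    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://wordpress_backend;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }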