How to prevent bad caching in users' browsers with virtual hosts on Nginx

Hello DO fellas, I am going to narrate a very bad error I committed last week on my Nginx server, so nobody has to live through the same experience, and at the same time ask for your kind help on the matter, so you can guide me on the best way to fix and prevent this in the future.

Around 8 months ago I was a frustrated GoDaddy customer. I had around 50 of my customers' websites with them. To manage these websites I didn't use the terminal; I used cPanel. And that was it.

That was because 8 months ago I did not know anything at all about DigitalOcean, and I didn't know a single thing about Ubuntu, Nginx, SSL, UFW, vhosts, server blocks, Adminer, and so on and so forth. And even though I cannot claim now to be a master of these and related matters, I feel pretty comfortable (and very happy) using them. This change from being a GoDaddy cPanel user managing my customers' websites to the DigitalOcean paradigm has been an eye-opener for me like almost nothing else. (The other eye-opener was when I discovered the benefits of using Git with my team.) (Please take into account that I live in an under-developed country.) So thanks to all the guys and girls here in the community who write such great tutorials, especially Justin E., who takes the time to explain things clearly. You are spreading beautiful and useful knowledge and thus helping lots and lots of people improve their workflows and lives.

So now, 8 months later, I am a very happy DigitalOcean user with 4 droplets. One of them hosts the 50 websites I used to have at GoDaddy. This particular droplet runs Nginx on Ubuntu 16, PHP 7, and MySQL. It is a 20-dollar-a-month droplet, so it has 2 CPUs. It receives around 500-1000 visits a day (across all 50 websites combined). I have installed htop to monitor it (because I'm basically sitting all day in front of my computer), and occasionally I watch the DigitalOcean graphs for the droplet to make sure everything runs smoothly. My websites serve a lot faster than when they were hosted at GoDaddy.

Now to the crux of the matter. A few days ago I was migrating my last website from GoDaddy to DO (on GoDaddy they don't let you choose PHP 7 or Nginx, so sometimes I had to tweak and upgrade some details before bringing a website over to my fancy DO droplet). Since I learned from Justin E. how to set up Nginx server blocks (virtual hosts) on Ubuntu, I have 50 server block files in my /etc/nginx/sites-available folder, one separate server block for each website.

My error was this: while migrating the website from GoDaddy to my droplet, I pointed the A record of the website's domain to my droplet's IP BEFORE I had enabled the vhost (the server block for that website). So during those 5 minutes that the A record was pointing to my droplet without a virtual host enabled, every user who requested that domain in a browser ended up on MY WEBSITE instead (the one site out of the 50 that had the default_server directive set in its server block).

So now, even though I have successfully enabled the server block for that website, all the users who visited the domain in that window somehow have a redirection to my website in their browser cache that does NOT disappear even when they delete their browser cache.

Right now the only way I can make their browsers visit the correct domain (and not be redirected to my website) is by adding index.php at the end of the domain address. That is esoteric behavior to me (behavior I don't fully understand). I thought these sorts of things were fixed by clearing the browser cache, but they're not. Everyone who did NOT visit the domain during those 5 minutes has no problem, but all the others do: they are constantly being redirected to my domain.

I thought one way of avoiding this was to remove the default_server directive from the server block of MY domain, but if I do that and, for example, disable one of the other domains, then anyone who visits that domain name while it is disabled on the server is redirected to the first virtual host on the list. I wish I could find a way that, if a domain is disabled, visitors just get a 404 or something like that; I certainly don't want browsers to somehow "store" redirections from one domain to another.
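For what it's worth, a common pattern for this is a dedicated catch-all server block that carries the default_server flag itself, so any hostname that no enabled site matches gets refused instead of being served by one of the real sites. A minimal sketch (the filename is hypothetical, and it assumes you remove default_server from your own site's block, since only one block per address:port may carry it):

```nginx
# /etc/nginx/sites-available/catch-all  (hypothetical filename)
# Handles requests whose Host header matches no other enabled server block.
server {
    listen 80 default_server;
    listen [::]:80 default_server;

    # "_" is just a placeholder that never matches a real hostname;
    # this block is chosen only because it is marked default_server.
    server_name _;

    # 444 is an Nginx-specific status that closes the connection without
    # sending any response. Use "return 404;" instead if you would rather
    # send visitors a normal error page.
    return 444;
}
```

After symlinking it into sites-enabled, check and reload with `sudo nginx -t && sudo systemctl reload nginx`. With this in place, a disabled domain returns nothing cacheable at all rather than another site's content.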

Can someone shed some light on these matters, please? Thanks in advance.


This is an interesting issue. The best recommendation would be to set up your server block (virtual host) before pointing your DNS records to your droplet.

Another option would be to create a new default server block that hosts a static HTML file. You can use meta tags on that static page to prevent caching.
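As a sketch of that option (paths and filenames are illustrative), the default server block would serve a small holding page and also send a Cache-Control header, so browsers that land on it mid-migration are told not to store anything:

```nginx
# Hypothetical default server block serving an uncacheable placeholder page.
server {
    listen 80 default_server;
    server_name _;

    root /var/www/holding;   # directory containing a single index.html
    index index.html;

    # Tell browsers and intermediate caches never to store this response.
    add_header Cache-Control "no-store, no-cache, must-revalidate" always;

    location / {
        try_files $uri /index.html;
    }
}
```

The holding page itself can carry a matching `<meta http-equiv="Cache-Control" content="no-store">` tag as a belt-and-braces measure for anything that ignores the header.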

From what you’ve described, my best guess is that your previous default site had a redirect set up, so even if no content was cached, a redirect (especially a 301 redirect, which browsers treat as permanent) could still be cached locally. If you’re running WordPress on that default site, it may issue a redirect to make the requested address match the site URL it’s configured with.
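One way to tell whether the redirect is still coming from the server, or now lives only in visitors' browser caches, is to request the domain with curl, which keeps no cache of its own. The domain and IP below are placeholders for your own values:

```
# Fetch only the headers, without following redirects.
# A "Location:" header means the server itself is still redirecting;
# its absence means the remaining redirects are cached in browsers.
curl -sI http://example.com/ | grep -i -E '^(HTTP|Location)'

# To test the droplet directly, bypassing DNS, send the Host header by hand:
curl -sI -H "Host: example.com" http://203.0.113.10/ | grep -i -E '^(HTTP|Location)'
```

If the server side is clean, affected visitors may still need a hard refresh (Ctrl+Shift+R), since cached 301 redirects can be surprisingly sticky in some browsers and are not always removed by an ordinary cache clear.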