I have a 512MB droplet that works great, and am quite happy with DigitalOcean.
Recently, I noticed that the droplet caches my outgoing HTTP requests. For example, I used
wget http://some.dynamic.website.com to fetch its
index.html. I got exactly the same file every time (even the Linux file metadata was identical), even though the content of the page had actually changed, which I verified by visiting it directly from my local web browser. When I tried
curl instead, I got a new file, but it was still filled with old content.
Apparently, the droplet had cached the outgoing HTTP requests (and perhaps files generated by a particular program as well?). This cuts down physical network traffic significantly, which benefits DigitalOcean and, in most cases, the user too. However, if I know the target webpage updates frequently and I mean to get the latest content, I should be able to do so.
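In case it is relevant: assuming whatever cache sits in between honors standard HTTP caching headers, one way to explicitly ask for a fresh copy would be something like this (using the placeholder URL from above):

```shell
# wget's --no-cache sends Cache-Control: no-cache, asking intermediate
# caches to revalidate with the origin instead of serving a stored copy
wget --no-cache -O index.html http://some.dynamic.website.com/

# curl equivalent: send the header explicitly, and also append a unique
# query string so a cache keyed on the full URL can never find a match
curl -H 'Cache-Control: no-cache' "http://some.dynamic.website.com/?nocache=$(date +%s)"
```

The query-string trick should work even against a cache that ignores request headers, since every request uses a distinct URL. (Whether DigitalOcean's setup respects either mechanism, I don't know.)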
As a reference, I created a new droplet, which turned out to behave differently: it gave me the latest content every time. On my month-old working droplet, by contrast, the content updated only once or twice per day. It seems that DigitalOcean relies on some internal statistics to decide whether, and how aggressively, its caching machinery is used.
So my questions are: