Best practice for web2py/nginx/PostgreSQL Python deployment using Docker... or not?
About to start on a brand new site, so I can start with a clean slate. That’s good. I’m back to Python after an 8-year absence, which is bad. The site will be built using web2py, which requires no special configuration. The back end will be PostgreSQL, and the web server will be nginx. I’m trying to understand best practices. I want the setup to be:
- Easy to replicate on another ISP, especially if (heaven forfend) DO doesn’t work out or if (far more likely) the site is attacked
- Able to handle a staging version, a testing version, and a production version
- Easy to scale up when I get customers (note: it’s a multitenant app)
- Easy to hand off to someone once I finish the 1.0 version
- Properly version controlled
I think the right approach is as follows, but please correct me if I’m wrong.
- Use Docker to install components such as Python, nginx, PostgreSQL, and web2py
- I imagine pip installs get executed via RUN instructions in the Dockerfile?
- Use git for version control and management of libs such as jQuery and jactive.js
- How does one handle staging vs. testing vs. production versions? Separate Docker containers? And then use git to promote code from staging to production, for example?
- How can I tell programmatically if Docker failed to install something?
- How do I automate creating a new droplet and the Docker containers inside it? Shell scripts over ssh?
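To make the pip-via-RUN question concrete, here’s the kind of Dockerfile I’m imagining for the app container. All the names (`requirements.txt`, the web2py paths, the admin password placeholder) are my own guesses, and I may well have the idiom wrong:

```dockerfile
# Hypothetical app-container sketch; base image tag and paths are guesses.
FROM python:3.11-slim

WORKDIR /app

# pip installs happen at build time via RUN, so a failed install
# fails the whole `docker build`.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the web2py application code into the image.
COPY . .

EXPOSE 8000
# web2py's built-in server; in production nginx would proxy to this.
CMD ["python", "web2py.py", "-i", "0.0.0.0", "-p", "8000", "-a", "<recycle>"]
```

Does that match how people actually do it, or is there a better-established layout?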
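On the staging/testing/production question, my current thinking is one compose file per environment rather than hand-built containers — something like the sketch below, where the service names, env files, and volume names are all invented:

```yaml
# docker-compose.staging.yml -- hypothetical; production would be a
# near-identical file with its own env_file, volumes, and ports.
services:
  web:
    build: .
    env_file: staging.env        # DB credentials, web2py settings, etc.
    depends_on:
      - db
  nginx:
    image: nginx:stable
    ports:
      - "8080:80"                # production would publish 80/443
    depends_on:
      - web
  db:
    image: postgres:16
    env_file: staging.env
    volumes:
      - staging_pgdata:/var/lib/postgresql/data

volumes:
  staging_pgdata:
```

Then promoting to production would just be merging to the production branch in git and rebuilding with the production compose file. Is that sane?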
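For droplet automation, instead of hand-rolled ssh scripts I’m wondering about DigitalOcean’s `doctl` CLI plus a cloud-init user-data file, i.e. `doctl compute droplet create ... --user-data-file bootstrap.yml` with something like this (package names and the repo URL are placeholders, not tested):

```yaml
#cloud-config
# bootstrap.yml -- hypothetical first-boot setup for a new droplet.
package_update: true
packages:
  - docker.io
runcmd:
  - git clone https://example.com/me/mysite.git /srv/mysite
  - cd /srv/mysite && docker compose -f docker-compose.staging.yml up -d --build
```

Is that the usual approach, or do people reach for something heavier like Terraform/Ansible at this scale?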
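On detecting failures programmatically: as far as I can tell, every RUN step aborts the build on a non-zero exit code, and `docker build` itself then exits non-zero, so a small wrapper like this (my own invented helper, not tested against a real build) should be enough for scripts or CI:

```shell
# run_step: run any command and report a non-zero exit code.
# Hypothetical usage: run_step docker build -t myapp:staging .
run_step() {
    "$@"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "step failed (exit $rc): $*" >&2
        return "$rc"
    fi
}
```

I believe `docker compose up --build` behaves the same way, so the exit status alone tells you whether an install step broke.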
Thanks much. Sorry for the treatise-length question, but I’d like to start out with good habits and not harm the person who follows me.