Best practice for web2py/nginx/postgresql Python deployment using Docker... or not?

December 18, 2014 · 2.1k views

About to start on a brand new site, so I can start with a clean slate. That's good. I'm back to Python after an 8-year absence, which is bad. The site will be built using web2py, which requires no special configuration. The back end will be PostgreSQL and the web server will be nginx. I'm trying to understand best practices. I want it to be:

  • Easy to replicate on another ISP, esp. if (heaven forfend) DO doesn't work out or if (far more likely) the site is attacked
  • Able to handle a staging version, a testing version, and a production version
  • Easy to scale up when I get customers (note: it's a multitenant app)
  • Easy to hand off to someone once I finish the 1.0 version
  • Properly version controlled

I think the right approach is as follows, but please correct me if I'm wrong.

  • Use Docker to install components such as python, nginx, postgresql, and web2py
  • I imagine pip commands get executed using Docker RUN commands?
  • Use git for version control and management of libs such as jQuery and jactive.js

Open questions:
  • How does one handle staging vs. testing vs. production versions? Separate Docker containers? Then use git to promote code from staging to production, for example?
  • How can I tell programmatically if Docker failed to install something?
  • How do I automate the process of creating a new droplet and docker container within that droplet? Shell scripts using ssh?

Thanks much. Sorry for the treatise-length question but I'd like to start out with good habits and not harm the person who follows me.

2 Answers

This sounds like a great approach. Besides isolating processes, one of the big wins with Docker is that it makes your application portable. One thing to note: you should really think of each container as providing a separate service. For instance, don't install postgres and your web2py app in the same container; use a separate one for each. This will also help with future scalability concerns.
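To make that concrete: once postgres runs in its own container, your web2py app connects to it over the network instead of over localhost. Here's a minimal sketch of a web2py model file, assuming the postgres container is reachable under the hostname "db" (for example via --link db:db) and using placeholder credentials:

# db.py -- web2py model connecting to postgres running in a separate, linked container.
# "db", "appuser", "secret", and "appdb" are placeholders for your own setup.
from gluon.dal import DAL   # inside web2py model files this import is optional

db = DAL('postgres://appuser:secret@db:5432/appdb',
         pool_size=10)      # reuse connections across requests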

As for telling programmatically if Docker failed to install something: building the image will error out in that case. Take this Dockerfile:

FROM ubuntu

RUN apt-get update && apt-get install -y python python-pip
RUN pip install a-package-that-doesnt-exist

You won't be able to produce an image, as the RUN command returns a non-zero exit code:

Step 3 : RUN pip install a-package-that-doesnt-exist
 ---> Running in 286595d506a5
Downloading/unpacking a-package-that-doesnt-exist
  Could not find any downloads that satisfy the requirement a-package-that-doesnt-exist
Cleaning up...
No distributions at all found for a-package-that-doesnt-exist
Storing debug log for failure in /.pip/pip.log
2014/12/19 14:44:47 The command [/bin/sh -c pip install a-package-that-doesnt-exist] returned a non-zero code: 1
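If you want to catch that from a script rather than by eye, the non-zero exit code is all you need to check. A minimal sketch, assuming the docker client is installed on the machine running the script and using a placeholder image tag:

# check_build.py -- run "docker build" and fail loudly if the image can't be built.
import subprocess

ret = subprocess.call(['docker', 'build', '-t', 'myapp:latest', '.'])
if ret != 0:
    raise SystemExit('docker build failed with exit code %d' % ret)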

For automating deployment, there are many different options that you could use. As you're already working in Python, Fabric might be a good choice. It lets you do tasks like uploading files and running remote commands. Check out the article linked below for more information.
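To give a rough idea, a deploy task in a fabfile could look something like this sketch (Fabric 1.x style; the host, path, and image names are placeholders, and it assumes the droplet already has Docker, a clone of your repo, and a deploy user allowed to run docker):

# fabfile.py -- sketch of a remote deploy task; run it with: fab deploy
from fabric.api import cd, env, run, task

env.hosts = ['deploy@your-droplet-ip']

@task
def deploy():
    with cd('/srv/myapp'):
        run('git pull origin master')
        run('docker build -t myapp:latest .')
        run('docker rm -f myapp || true')   # remove the old container if one is running
        run('docker run -d --name myapp -p 80:8000 myapp:latest')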

If you wanted to incorporate creating droplets into the same scripts, you could use our API via python-digitalocean.
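Creating a droplet from a script is only a few lines; here's a sketch with python-digitalocean (the API token, name, region, and image slug below are placeholders):

# create_droplet.py -- spin up a droplet via the DigitalOcean API.
import digitalocean

droplet = digitalocean.Droplet(
    token='YOUR_API_TOKEN',       # personal access token from the control panel
    name='web2py-staging',
    region='nyc3',
    image='ubuntu-14-04-x64',
    size_slug='512mb')
droplet.create()                  # returns once the create request is accepted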

I'd love to hear others' thoughts, and what you end up finalizing on!

by O.S. Tezer
In this DigitalOcean article, the system administration and application deployment streamlining library "Fabric" is our subject. We will learn how to install this wonderful tool, as well as see how easy things can become by simply automating mundane management tasks that would otherwise require jumping through hoops with bash hacks and hard-to-maintain, complex scripts.

Thanks for your perspective, asb. I'm having trouble getting my arms around the concept of how postgres and Python can talk to each other in separate containers. I now see your suggestion is the "right" one; just trying to come to terms with it.
