Solution to maintain a web app (PHP) deployed in different droplets

Posted on June 10, 2019

Hi,

I’m developing a PHP web app that, for business model reasons, we’re planning to initially deploy to each customer on their own virtual machine (droplet).

I haven’t used Docker or Kubernetes before, but I understand they could help with this. Before I dig deeper into that, I would like to know if it’s indeed the right path to follow.

After the app is deployed, we’ll keep releasing updates, so we’re looking for a way to simply “push” them to each customer’s droplet with ease.

I would like to ask the community for opinions on this and, if possible, a rough list of steps to follow.

Many thanks

Hi there,

Here are a few clarifying questions about the service you are providing that may help guide your decision.

  • Does your frontend scale well to many instances?
  • If applicable, can your backend scale well to many instances?
  • Is your application tightly coupled to any other services?
  • Can your application easily be broken down into logical components?
  • Are you having issues with your current design right now?

Other things to consider to get an idea of what the application may look like:

  • Does the webapp frontend have differences per client?
  • Do the updates you’re pushing go out to each client, or sometimes only to a few?
  • Do you need storage for some sort of backend?
  • Do the clients each need separate backends, or can they share a single one?

While Kubernetes and containerization can be beneficial in a lot of ways, for very simple use cases it can be unnecessary legwork to containerize your application and learn Kubernetes and the surrounding technologies for little actual benefit.

That being said, containerization does (in most cases) ease the deployment and portability of the application in question.

It may largely come down to what your biggest priorities for this application are; that should steer you in one direction or the other.

Regards,

John Kwiatkoski
Senior Developer Support Engineer

Deploying an individual instance of a web application per customer (sometimes called a single-tenant deployment, where each customer gets their own isolated environment) can indeed benefit significantly from containerization and orchestration technologies like Docker and Kubernetes. Here’s a breakdown of how these technologies could be useful for your scenario and a suggested path forward:

Benefits of Using Docker and Kubernetes

  1. Consistency Across Environments: Docker containers package your application and all its dependencies into a single unit, ensuring that it runs the same way in every environment.

  2. Scalability: Kubernetes excels in managing and scaling applications dynamically according to the load and requirements without manual intervention.

  3. Isolation: Each customer’s application instance can run within its own set of containers, ensuring security and isolation.

  4. Simplified Updates: Rolling updates, rollbacks, and canary deployments are streamlined in Kubernetes, allowing you to update customer applications with minimal downtime.

  5. Resource Efficiency: Containers share the host system’s OS kernel and are lighter than traditional virtual machines, which means you can run more containers than VMs on the same hardware.

Suggested Steps to Get Started

Step 1: Containerize Your Application

  • Develop a Dockerfile: Create a Dockerfile for your PHP application. This file contains the instructions for building the Docker image of your application: the base image, dependencies, and the commands needed to build and run it (a minimal sketch follows this list).
  • Build and Test Locally: Build your Docker image locally and test to ensure everything runs correctly inside the container.
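
For illustration, here is a minimal Dockerfile sketch for an Apache-served PHP app. It assumes your code lives in a local src/ directory and that you need the pdo_mysql extension; adjust the base image tag and extensions to match your application.

```dockerfile
# Minimal sketch, not a drop-in config: assumes an Apache-served PHP app
# with its source in ./src and a MySQL database accessed via PDO.
FROM php:8.2-apache

# Install only the PHP extensions your app actually needs.
RUN docker-php-ext-install pdo_mysql

# Copy the application code into Apache's default document root.
COPY src/ /var/www/html/

EXPOSE 80
```

You can then build and run it locally with `docker build -t php-app .` and `docker run -p 8080:80 php-app` to verify the app behaves the same inside the container as it does outside.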

Step 2: Prepare the Deployment Infrastructure

  • Set Up Kubernetes Cluster: You can set up a Kubernetes cluster per customer or a single cluster with strong isolation practices (e.g., using namespaces; see the sketch after this list). Managed services such as DigitalOcean Kubernetes, Amazon EKS, or Google GKE, or self-managed clusters on VMs, can be used.
  • Configure Networking: Set up ingress controllers and service meshes as needed for routing traffic and managing services within your Kubernetes cluster.
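
If you go with a single shared cluster, one common isolation approach is a namespace per customer, optionally paired with a resource quota. A rough sketch follows; the customer name "acme" and the quota values are placeholders.

```yaml
# Sketch: one namespace per customer, with a quota so no single
# customer's workloads can exhaust the shared cluster.
apiVersion: v1
kind: Namespace
metadata:
  name: customer-acme
  labels:
    tenant: acme
---
apiVersion: v1
kind: ResourceQuota
metadata:
  name: customer-acme-quota
  namespace: customer-acme
spec:
  hard:
    requests.cpu: "2"
    requests.memory: 4Gi
    limits.cpu: "4"
    limits.memory: 8Gi
```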

Step 3: Automate Deployment

  • Continuous Integration / Continuous Deployment (CI/CD): Implement CI/CD pipelines using tools like Jenkins, GitLab, or GitHub Actions. This setup will handle your testing and deployment automatically every time you make changes to your codebase.
  • Deployment Manifests: Describe your Kubernetes deployments in YAML files. These manifests define how your application should be deployed, the resources it requires, and how updates are handled (a sketch follows this list).
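
As a sketch of what such a manifest might look like for the image from Step 1 (the image reference, namespace, replica count, and resource requests are all placeholders):

```yaml
# Sketch: a Deployment running the PHP app image in one customer's namespace.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: php-app
  namespace: customer-acme
spec:
  replicas: 2
  selector:
    matchLabels:
      app: php-app
  template:
    metadata:
      labels:
        app: php-app
    spec:
      containers:
        - name: php-app
          image: registry.example.com/php-app:1.4.2  # placeholder image reference
          ports:
            - containerPort: 80
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
```

Your CI/CD pipeline can then deploy an update by changing the image tag in this manifest and applying it with `kubectl apply -f`.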

Step 4: Manage Data Persistence

  • Databases and Storage: Decide how you’ll handle persistent storage in Kubernetes, which could include using external databases per customer or persistent volumes within the cluster.
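
For example, if each customer’s instance needs its own file storage (say, for uploads), a PersistentVolumeClaim per namespace is one option; on managed clusters the default storage class typically provisions a volume automatically. A sketch, with placeholder names and size:

```yaml
# Sketch: per-customer persistent storage claimed from the cluster's
# default storage class.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: php-app-data
  namespace: customer-acme
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
```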

Step 5: Implement Monitoring and Logging

  • Monitoring Tools: Integrate monitoring tools like Prometheus and Grafana to keep an eye on your application’s performance and health (a sketch follows this list).
  • Logging: Set up centralized logging with tools like ELK Stack (Elasticsearch, Logstash, and Kibana) or Loki to aggregate and manage logs efficiently.
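
As one illustration of the monitoring wiring, assuming you install the Prometheus Operator and your app (or a sidecar exporter) exposes a metrics endpoint behind a Service with a named "metrics" port, a ServiceMonitor tells Prometheus what to scrape. All names and labels below are placeholders:

```yaml
# Sketch: requires the Prometheus Operator CRDs and a Service labelled
# app=php-app exposing a port named "metrics" in the customer namespace.
apiVersion: monitoring.coreos.com/v1
kind: ServiceMonitor
metadata:
  name: php-app
  namespace: customer-acme
spec:
  selector:
    matchLabels:
      app: php-app
  endpoints:
    - port: metrics
      interval: 30s
```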

Step 6: Testing and Rollout

  • Test Thoroughly: Before going live, thoroughly test the entire deployment in a staging environment.
  • Phased Rollout: Start deploying for a small number of customers and gradually extend to more, utilizing the feedback to improve the setup.
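
At the day-to-day level, pushing an update to a single customer and rolling it back if needed can look like this (the namespace, deployment name, and image tag are placeholders):

```bash
# Roll a new image out to one customer's namespace.
kubectl -n customer-acme set image deployment/php-app \
  php-app=registry.example.com/php-app:1.4.3

# Watch the rolling update complete.
kubectl -n customer-acme rollout status deployment/php-app

# Roll back to the previous revision if something goes wrong.
kubectl -n customer-acme rollout undo deployment/php-app
```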

Moving Forward

Start by learning Docker basics, then experiment with deploying a small, non-critical application using Kubernetes to understand how these technologies fit together. Online courses, official documentation, and community forums are invaluable resources as you scale up your knowledge and implementation.

Deploying and managing applications at this level can be complex, but the flexibility, scalability, and robustness of using Docker and Kubernetes often justify the initial learning curve and setup efforts.
