I have a Django application running with Docker on a Droplet.
I also have Redis and Elasticsearch services running in Docker whose data I would obviously like to keep rather than just rm-ing the directory. My database is a DO managed instance.
I already have a non-root user setup along with an ssh key on my GitHub for that user.
Is there any documentation on DO, or any that you would recommend, for setting this up?
Here’s my current workflow:
```yaml
name: CI/CD

on:
  push:
    branches: ["main"]
  pull_request:
    branches: ["main"]

jobs:
  CI:
    runs-on: ubuntu-latest
    environment: Django Test
    services:
      elasticsearch:
        image: elasticsearch:7.17.9
        ports:
          - "9200:9200"
        options: -e="discovery.type=single-node" --health-cmd="curl http://localhost:9200/_cluster/health" --health-interval=10s --health-timeout=5s --health-retries=10
      postgres:
        image: postgres:13
        env:
          POSTGRES_USER: postgres
          POSTGRES_DB: postgres
          POSTGRES_PASSWORD: postgres
        ports:
          - 5433:5432
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5
    strategy:
      max-parallel: 4
      matrix:
        python-version: ["3.10"]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Run Tests
        env:
          SECRET_KEY: ${{ secrets.SECRET_KEY }}
          POSTGRES_NAME: ${{ secrets.POSTGRES_NAME }}
          POSTGRES_USER: ${{ secrets.POSTGRES_USER }}
          POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }}
          POSTGRES_PORT: ${{ secrets.POSTGRES_PORT }}
          DJANGO_SETTINGS_MODULE: ${{ secrets.DJANGO_SETTINGS_MODULE }}
          CELERY_BROKER_URL: ${{ secrets.CELERY_BROKER_URL }}
          YOUTUBE_V3_API_KEY: ${{ secrets.YOUTUBE_V3_API_KEY }}
        run: |
          coverage run manage.py test && coverage report --fail-under=90
```
Hi there,
Quickly jumping in here! Great to hear that you’ve got your deployment process working with SSH, `git pull`, and Docker Compose! To minimize downtime during deployment, there are several strategies you can implement. Building a Docker image can take a significant amount of time, during which your application might be unavailable.
Stopping and starting containers also contributes to downtime.
What I could suggest here is:
Instead of building images on the production server, consider using a GitHub Workflow to build Docker images and push them to a registry. Then, your production server can simply pull the latest image, reducing the build time and the load on the production server.
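For example, you could add a job along these lines to the same workflow to build and push the image to the DigitalOcean Container Registry. This is only a sketch: the registry name `<your-registry>`, the image name `django-app`, and the `DIGITALOCEAN_ACCESS_TOKEN` secret are placeholders you would adjust to your setup.

```yaml
  # Append under the existing `jobs:` key
  CD:
    runs-on: ubuntu-latest
    needs: CI                       # only deploy if the test job passed
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v3
      - name: Install doctl
        uses: digitalocean/action-doctl@v2
        with:
          token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
      - name: Log in to the DigitalOcean Container Registry
        run: doctl registry login --expiry-seconds 600
      - name: Build and push the image
        run: |
          docker build -t registry.digitalocean.com/<your-registry>/django-app:latest .
          docker push registry.digitalocean.com/<your-registry>/django-app:latest
```

Your production server then only needs to pull and restart, rather than rebuild.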
Docker Compose supports adding some meta information for the updates, which can help minimize downtime by updating containers one by one rather than all at once:
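For instance, the Compose specification’s `deploy` section lets you describe how updates should roll out. Note that `update_config` is fully honored when the stack is deployed with Docker Swarm; with plain `docker-compose up` you can approximate the same effect by recreating one service at a time. The service and image names below are placeholders:

```yaml
services:
  web:
    image: registry.digitalocean.com/<your-registry>/django-app:latest
    deploy:
      replicas: 2
      update_config:
        parallelism: 1        # update one container at a time
        delay: 10s            # wait between updates
        order: start-first    # start the new container before stopping the old one
```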
Implement health checks in your Docker configuration. This ensures that traffic is only routed to the container once it’s fully ready to handle requests:
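A sketch of such a health check, assuming your Django container serves a health endpoint on port 8000 and has `curl` available in the image (adjust the URL and timings to your app):

```yaml
services:
  web:
    # ...
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/healthz/"]  # hypothetical health endpoint
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 15s      # give Django time to boot before counting failures
```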
Here’s an example `deploy.sh` script that you can use on your server once you’ve offloaded the build stage to a GitHub Action and have your images available in a container registry.
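A minimal sketch of such a script, assuming a `web` service in your compose file and a project directory path that you would replace with your own:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Path to the directory containing docker-compose.yml (placeholder)
cd /home/deploy/your-project

# Pull the latest code (compose file, configs, etc.)
git pull origin main

# Pull the pre-built image that the GitHub workflow pushed to the registry
docker-compose pull web

# Recreate only the web service, leaving Redis and Elasticsearch untouched
docker-compose up -d --no-deps web

# Clean up dangling images left behind by previous deployments
docker image prune -f
```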
- `docker-compose up -d` restarts your containers with minimal downtime.
- The `--no-deps` flag prevents Docker Compose from also recreating linked services.
- The `--build` flag is optional if you are building images on the fly. If you are pulling pre-built images from a registry, you can omit it.

Hope that this helps!
Best,
Bobby
Heya,
A possible solution to keep the data from your Redis and Elasticsearch services when running your Django application on a DigitalOcean Droplet is to use Docker volumes.
Docker volumes allow you to store data that is independent of the Docker containers themselves, ensuring that the data persists even if you restart or remove the containers.
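For example, you could declare named volumes for both services in your `docker-compose.yml`; the mount paths below are the data directories used by the official Redis and Elasticsearch images, so adjust them if your setup differs:

```yaml
services:
  redis:
    image: redis:7
    volumes:
      - redis_data:/data                          # Redis dump/AOF files

  elasticsearch:
    image: elasticsearch:7.17.9
    environment:
      - discovery.type=single-node
    volumes:
      - es_data:/usr/share/elasticsearch/data     # Elasticsearch index data

volumes:
  redis_data:
  es_data:
```

With named volumes, recreating the containers during a deployment (or running `docker-compose down` followed by `up`) reuses the same data; only `docker-compose down -v` would remove it.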
Happy holidays!