
Deploying a Django Application with WebSockets, Redis, Celery, Celery Beat and PostgreSQL on VPS using Docker
Deploying a Django application with a stack that includes Celery, WebSockets, Redis, and PostgreSQL can seem daunting, but Docker simplifies the process significantly. In this guide, we'll walk through containerizing and deploying a production-ready Django application on a VPS.
Prerequisites
- A VPS running Ubuntu 20.04/22.04 (2GB RAM minimum recommended)
- Docker and Docker Compose installed
- Basic familiarity with Django, Docker, and Linux commands
- Domain name pointed to your VPS (optional for production)
Step 1: Prepare Your Django Project
Ensure your Django project is ready for production:
# settings.py
import os
from pathlib import Path
# Base
BASE_DIR = Path(__file__).resolve().parent.parent
SECRET_KEY = os.environ['DJANGO_SECRET_KEY']
DEBUG = bool(int(os.environ.get('DJANGO_DEBUG', 0)))
ALLOWED_HOSTS = os.environ['DJANGO_ALLOWED_HOSTS'].split(',')
# Database (PostgreSQL)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['POSTGRES_DB'],
        'USER': os.environ['POSTGRES_USER'],
        'PASSWORD': os.environ['POSTGRES_PASSWORD'],
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}
# Redis (for Cache/Celery)
REDIS_URL = f"redis://:{os.environ['REDIS_PASSWORD']}@{os.environ.get('REDIS_HOST', 'redis')}:{os.environ.get('REDIS_PORT', '6379')}/0"
# Celery
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', REDIS_URL)
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', REDIS_URL)
# Security (for production)
if not DEBUG:
    CSRF_COOKIE_SECURE = True
    SESSION_COOKIE_SECURE = True
    SECURE_SSL_REDIRECT = True
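Because the stack serves WebSockets through Django Channels (the Compose file later runs the ASGI app with uvicorn), you will also want a Redis-backed channel layer. A minimal sketch, assuming channels and channels_redis are in your requirements and your ASGI application lives at <project_name>.asgi:
# settings.py (continued)
ASGI_APPLICATION = '<project_name>.asgi.application'
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            # Reuse the same Redis instance that backs Celery and the cache
            'hosts': [REDIS_URL],
        },
    },
}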
Step 2: Create Docker Configuration
Create a Dockerfile for your Django application:
# Dockerfile
FROM python:3.11-slim
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set work directory
WORKDIR /app
# Install system dependencies
RUN apt-get update && \
    apt-get install -y build-essential libpq-dev gcc libxml2-dev libxslt1-dev libjpeg-dev zlib1g-dev libffi-dev libssl-dev git && \
    rm -rf /var/lib/apt/lists/*
# Install Python dependencies
COPY requirements.txt ./
RUN pip install --upgrade pip && pip install -r requirements.txt
# Copy project
COPY . .
# Collect static files
RUN python manage.py collectstatic --noinput || true
# Expose port (Django default)
EXPOSE 8000
# Default command (can be overridden in docker-compose)
CMD ["gunicorn", "<project_name>.wsgi:application", "-b", "0.0.0.0:8000"]
Step 3: Configure Docker Compose
Create a docker-compose.yml file to orchestrate all services:
# docker-compose.yml
services:
  db:
    image: postgres:latest
    container_name: postgres_db_pyzit
    restart: always
    env_file:
      - .env
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  redis:
    image: redis:latest
    container_name: redis_db_pyzit
    restart: always
    env_file:
      - .env
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --requirepass ${REDIS_PASSWORD}

  web:
    build: .
    # command: gunicorn <project_name>.wsgi:application -b 0.0.0.0:8000
    command: uvicorn <project_name>.asgi:application --host 0.0.0.0 --port 8000
    # command: python manage.py runserver 0.0.0.0:8000
    restart: always
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    env_file:
      - .env
    depends_on:
      - db
      - redis

  celery:
    build: .
    command: celery -A <project_name> worker -l info
    restart: always
    volumes:
      - .:/app
    env_file:
      - .env
    depends_on:
      - db
      - redis

  celery-beat:
    build: .
    command: celery -A <project_name> beat -l info
    restart: always
    volumes:
      - .:/app
    env_file:
      - .env
    depends_on:
      - db
      - redis

volumes:
  postgres_data:
  redis_data:
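The celery and celery-beat services assume the standard Celery/Django wiring in <project_name>/celery.py. If your project does not have it yet, a minimal sketch (module paths are placeholders for your project name):
# <project_name>/celery.py
import os
from celery import Celery

# Point Celery at the Django settings module before the app is created
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<project_name>.settings')

app = Celery('<project_name>')
# Read CELERY_* settings (broker URL, result backend) from settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')
# Discover tasks.py modules in installed apps
app.autodiscover_tasks()

# <project_name>/__init__.py should then expose the app:
# from .celery import app as celery_app
# __all__ = ('celery_app',)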
Step 4: Create the .env File
Create a .env file in the project root, alongside the Dockerfile and docker-compose.yml:
# PostgreSQL Configuration
POSTGRES_DB=<db_name>
POSTGRES_USER=<user_name>
POSTGRES_PASSWORD=<strong_password>
POSTGRES_HOST=db
POSTGRES_PORT=5432
# Redis Configuration
REDIS_PASSWORD=strong_redis_password_here
REDIS_HOST=redis
REDIS_PORT=6379
# Django Configuration (for web service)
DJANGO_SECRET_KEY=your-django-secret-key-here-make-it-very-long-and-random
DJANGO_DEBUG=0
DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1
# Celery Configuration (for Celery services, optional)
CELERY_BROKER_URL=redis://:strong_redis_password_here@redis:6379/0
CELERY_RESULT_BACKEND=redis://:strong_redis_password_here@redis:6379/0
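Keep the .env file out of version control, and keep it (along with other local artifacts) out of the Docker build context, since the Dockerfile copies the whole directory with COPY . . An illustrative .dockerignore:
# .dockerignore (illustrative)
.env
.git
__pycache__/
*.pyc
*.sqlite3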
Step 5: Configure Nginx
Create an nginx.conf file for handling HTTP/HTTPS and WebSocket connections:
# nginx.conf
events {
    worker_connections 1024;
}

http {
    upstream django {
        server web:8000;
    }

    server {
        listen 80;
        server_name yourdomain.com;

        location / {
            proxy_pass http://django;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }

        location /ws/ {
            proxy_pass http://django;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
        }

        location /static/ {
            alias /app/static/;
        }
    }
}
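The docker-compose.yml above does not yet include the reverse proxy itself, and the SSL step below stops an nginx container, so add an nginx service to Compose as well. A sketch, assuming nginx.conf sits next to docker-compose.yml (the service and container names here are illustrative):
# docker-compose.yml (additional service under services:)
  nginx:
    image: nginx:latest
    container_name: nginx_proxy
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
      - .:/app:ro
    depends_on:
      - web
Mounting the project directory (or a dedicated static volume) into the nginx container is what lets the /static/ alias in nginx.conf resolve to the collected static files.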
Step 6: Deploy to VPS
On your VPS, follow these steps:
Install Docker and Docker Compose:
# Add Docker's official GPG key:
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc

# Add the repository to Apt sources:
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "${UBUNTU_CODENAME:-$VERSION_CODENAME}") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update

# Install the Docker packages:
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
- Transfer your project files to the VPS using Git or SCP
Build and start the containers:
docker-compose build
docker-compose up -d
Run database migrations:
docker-compose exec web python manage.py migrate
Create a superuser (optional):
docker-compose exec web python manage.py createsuperuser
Step 7: Set Up SSL (Optional but Recommended)
For production, add SSL using Let's Encrypt:
# Stop nginx container temporarily
docker-compose stop nginx
# Install certbot
sudo apt install certbot
# Obtain certificates
sudo certbot certonly --standalone -d yourdomain.com
# Update nginx.conf to include SSL configuration
# (Add a new server block listening on 443 with SSL settings)
# Restart everything
docker-compose up -d
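The 443 server block referenced above could look roughly like this, assuming the Let's Encrypt certificates are mounted into the nginx container (for example by adding - /etc/letsencrypt:/etc/letsencrypt:ro to its volumes):
# nginx.conf (additional server block inside http {})
server {
    listen 443 ssl;
    server_name yourdomain.com;

    ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;

    location / {
        proxy_pass http://django;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /ws/ {
        proxy_pass http://django;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}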
Production Considerations
- Implement proper secret management (use Docker secrets or environment variables)
- Set up regular database backups (see the pg_dump sketch after this list)
- Configure monitoring for your services
- Implement CI/CD pipeline for automated deployments
- Consider scaling options (horizontal scaling with more workers)
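For the backup item, a simple starting point is a cron-driven dump through the db container. A sketch, run from the project directory (file name and retention policy are up to you):
#!/usr/bin/env bash
# backup.sh (illustrative)
set -euo pipefail
# Load POSTGRES_USER / POSTGRES_DB from the same .env Compose uses
set -a; source .env; set +a
# Dump the database from the running container and compress it
docker-compose exec -T db pg_dump -U "$POSTGRES_USER" "$POSTGRES_DB" | gzip > "backup_$(date +%F).sql.gz"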
Conclusion
You now have a production-ready Django application with WebSockets (via Django Channels), Redis for caching and channel layers, and PostgreSQL for data persistence, all running in Docker containers on your VPS. This setup provides a solid foundation that you can scale as your application grows.
Remember to regularly update your Docker images and monitor your application's performance. Happy deploying!

Wajahat Murtaza, Founder