How to Dockerize Django in 5 minutes
Dockerizing a Django project can be a daunting task. A complex Django project can have many moving parts: the Django server, the database, and perhaps Redis and a Celery worker.
This tutorial will show you how to Dockerize a Django project in less than 5 minutes. If you've been working with Django for a while, chances are you've heard of Docker before. But here's a quick summary of Docker and why you should consider using it in your project.
You can also watch the video tutorial on YouTube:
Docker in a nutshell
Docker is a very popular tool for containerising applications. Containers are powerful because your environment is set up exactly the same way every time the containers are started.
The benefits of this are:
- Your code runs on any operating system that supports Docker
- You save time by not needing to configure system dependencies on your host
- Your local and production environments can be exactly the same, eliminating errors that only happen in production
Understanding Docker
This tutorial does not go in depth on how Docker works. Instead, it focuses on how to set up Docker specifically for Django.
If you would like to learn more about Docker, my recommendation is to read the official Python guide. It is a relatively short tutorial but covers everything you need to know - which is actually not that much!
Dockerizing Django
Whether you have an existing project or are starting a new one, we'll use the same resource to bring Docker into it.
The resource we are going to use is Cookiecutter Django. If you are not familiar with Cookiecutter, it is a tool for bootstrapping projects from cookiecutters (project templates). It saves a lot of time when creating new projects because it configures a lot of boilerplate code for you.
One of the best parts of Cookiecutter Django is that it includes Docker configuration. We will be using this configuration to understand how Docker is implemented into a Django project.
Getting Started
Firstly, install Docker.
We are going to create two Django projects. The first is going to be a simple project created using the `django-admin` command. The second project will be created using Cookiecutter Django.
Create the first project
virtualenv simpleenv
source simpleenv/bin/activate
pip install django
django-admin startproject simpleproject
Here we are creating and activating a virtual environment. The activation command shown is for macOS/Linux; on Windows use `simpleenv\Scripts\activate` instead.
Create the second project
In a different folder, start by installing Cookiecutter with `pip install cookiecutter`. This will install Cookiecutter globally so that it is accessible at any time.
We can now use any Cookiecutter template to bootstrap a project. In a new terminal run the following commands to create the project using Cookiecutter Django.
virtualenv advancedVenv
source advancedVenv/bin/activate
cookiecutter gh:pydanny/cookiecutter-django
Here we are using a separate virtual environment for this project. The command `cookiecutter gh:pydanny/cookiecutter-django` uses the Cookiecutter command-line utility to create a project from the GitHub template `pydanny/cookiecutter-django`.
This command will prompt you to answer a few questions about the project you want to generate. By pressing enter you can leave each answer with the default value.
When prompted with the `use_docker` option, make sure to enter 'y' so that the project is configured with Docker.
After completing all of the prompts, a Django project will be generated. We are going to specifically look at the files created for configuring Docker. These are:
- The `compose` folder
- The `.dockerignore` file
- The `local.yml` file
- The `production.yml` file
TLDR
This is all you need to Dockerize a Django project. Simply copy these folders and files into your other Django project and adjust them so that they point to the correct files.
If you want to see a more advanced Docker configuration, generate a Cookiecutter Django project with the `use_celery` flag enabled. The Docker configuration will then include a setup for Celery and Redis.
Understanding the Docker configuration
The `compose` folder contains two folders, one for local development and one for production. Likewise, the `local.yml` file is used in local development and the `production.yml` file is used in production.

The `compose/local` folder goes hand in hand with the `local.yml` file. The `compose/production` folder goes hand in hand with the `production.yml` file.
Docker Compose is the most important tool to understand. We use it to run multi-container Docker applications. The `docker-compose` command-line utility comes bundled with Docker Desktop.
Running the Project with Docker
Make sure you have Docker running on your computer (for example via Docker Desktop), otherwise the following commands will not execute properly.
We use Docker Compose to build the image of our project. Images are like blueprints. Once the image is built, we then create a container, which is basically a running instance of an image. If we make any changes to the dependencies of the project (e.g. Python dependencies), we need to rebuild the image for them to take effect.
Build the Docker image by running:
docker-compose -f local.yml build
Notice that this command takes an argument with the `-f` flag. This tells Docker Compose to use the `local.yml` file as the configuration file.
If we open the `local.yml` file we have the following contents:
version: '3'

volumes:
  local_postgres_data: {}
  local_postgres_data_backups: {}

services:
  django:
    build:
      context: .
      dockerfile: ./compose/local/django/Dockerfile
    image: my_awesome_project_local_django
    container_name: django
    depends_on:
      - postgres
    volumes:
      - .:/app:z
    env_file:
      - ./.envs/.local/.django
      - ./.envs/.local/.postgres
    ports:
      - "8000:8000"
    command: /start

  postgres:
    build:
      context: .
      dockerfile: ./compose/production/postgres/Dockerfile
    image: my_awesome_project_production_postgres
    container_name: postgres
    volumes:
      - local_postgres_data:/var/lib/postgresql/data:Z
      - local_postgres_data_backups:/backups:z
    env_file:
      - ./.envs/.local/.postgres

  docs:
    image: my_awesome_project_local_docs
    container_name: docs
    build:
      context: .
      dockerfile: ./compose/local/docs/Dockerfile
    env_file:
      - ./.envs/.local/.django
    volumes:
      - ./docs:/docs:z
      - ./config:/app/config:z
      - ./my_awesome_project:/app/my_awesome_project:z
    ports:
      - "7000:7000"
    command: /start-docs
This file is a configuration file that lists out everything Docker needs to run our multi-container application. Take note of the `services` section. There are three services: `django`, `postgres` and `docs`.
Under each service there are a few configuration options.
Again, if you want to dive into the specifics of each option, refer back to the Docker documentation.
If we take a look at the `django` service we have the following:
django:
  build:
    context: .
    dockerfile: ./compose/local/django/Dockerfile
This configures the service so that it uses a specific Dockerfile. The Dockerfile being used comes from the local Docker configuration inside the `compose` folder.
Hopefully this shows how all of the Docker configuration connects together. The `local.yml` file contains services which point to specific Dockerfiles inside the `compose` folder. There are also other files used besides Dockerfiles.
For example, at the end of the file `compose/local/django/Dockerfile` we have the following:
...
COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint
RUN chmod +x /entrypoint
...
ENTRYPOINT ["/entrypoint"]
This tells Docker that when a container is started from this image, it runs the `entrypoint` script, which can be found at `compose/production/django/entrypoint`. Open that file and take a look at the contents. You'll see that it essentially waits for the Postgres database to accept connections and logs when it is available.
Taking another look at the `django` service:
django:
  build:
    ...
  command: /start
An important part of the `django` service is the `command` property. This tells Docker that the starting command for this container is the `start` script, which we can find inside `compose/local/django`. Inside this file we have the following:
#!/bin/bash
set -o errexit
set -o pipefail
set -o nounset
python manage.py migrate
python manage.py runserver_plus 0.0.0.0:8000
This should look very familiar. We have the Django migrations and the server being run. Something to note here is that the `runserver_plus` command comes from Django Extensions. You can replace `runserver_plus` with `runserver` if you do not have the package installed.
Do not remove the `0.0.0.0:8000` part: binding to 0.0.0.0 makes the server listen on all interfaces, which is what allows the container's published port to be reached from the host.
Now that we understand how Docker is configured, the last step is to run this command to start the multi-container application:
docker-compose -f local.yml up
This will run all of the services defined in the `local.yml` file. After running this command, go to http://localhost:8000 in your browser and you should see the default landing page load.
With this setup you can run the Django server, the Postgres database and documentation.
Final Changes
You will need to configure the Docker files for your project. Some things to take note of:
Environment variables
The Docker Compose files load environment variable files into the containers. These environment variable files are stored in the `.envs` folder generated by Cookiecutter Django. To be able to read these values you will need to install a package that handles environment variables.
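The `env_file` mechanism itself is simple: each `KEY=value` line in the file becomes an environment variable inside the container. A rough Python sketch of the idea (illustrative only, not how Docker actually implements it):

```python
import os

def load_env_file(path):
    """Mimic docker-compose's env_file option: read KEY=value
    lines from a file and expose them as environment variables."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()
```

Inside the container, your Django settings then read those variables back out, which is where the package below comes in.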
The package recommended by Cookiecutter Django is Django-Environ. You can install this package with:
pip install django-environ
Database settings
The database credentials are also included as environment variables so make sure to have the correct database settings.
DATABASES = {"default": env.db("DATABASE_URL")}
DATABASES["default"]["ATOMIC_REQUESTS"] = True
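Here `env.db()` reads the `DATABASE_URL` environment variable, which packs all of the connection details into a single URL such as `postgres://user:password@postgres:5432/dbname`, and turns it into a Django database dict. A rough stdlib sketch of that parsing (django-environ does this for you, with many more edge cases handled):

```python
from urllib.parse import urlparse

def parse_database_url(url):
    """Turn a postgres:// URL into a Django-style database dict,
    roughly what django-environ's env.db() produces."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port,
    }
```

Note that the host in the URL is the service name (`postgres`), not `localhost`: inside the Docker network, containers reach each other by service name.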
Allowed hosts
Make sure your allowed hosts include localhost.
ALLOWED_HOSTS = ["localhost", "0.0.0.0", "127.0.0.1"]
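For reference, the matching Django performs against ALLOWED_HOSTS works roughly like this (a simplified sketch, not Django's actual code): exact match, `*` matches anything, and a pattern with a leading dot matches the domain and all its subdomains.

```python
def host_allowed(host, allowed_hosts):
    """Simplified sketch of Django's ALLOWED_HOSTS check."""
    host = host.lower()
    for pattern in allowed_hosts:
        pattern = pattern.lower()
        if pattern == "*":
            return True
        if pattern.startswith("."):
            # ".example.com" matches example.com and any subdomain
            if host == pattern[1:] or host.endswith(pattern):
                return True
        elif host == pattern:
            return True
    return False
```

So with the setting above, requests arriving as `localhost`, `0.0.0.0` or `127.0.0.1` are accepted and anything else gets a 400 response.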
Ultimately this setup relies on two components: Docker Compose files and Dockerfiles. We have `local.yml` for local development; it points to the `compose/local` folder for everything it needs to run Docker locally. Likewise, we have `production.yml` for production, and it uses the `compose/production` folder.
All the credit goes to the Cookiecutter Django project. I highly recommend using it in your own projects. Not only is it a great resource for professional development but it can be used to learn many best practices including how to configure Docker in a Django project.