How to migrate (or create) and deploy a Django project

This guide will take you through the steps to deploy a portable, vendor-neutral Twelve-factor Django project. It includes configuration for:

  • a Postgres or MySQL database
  • media storage using an S3 object storage instance
  • static file serving with WhiteNoise
  • uWSGI, Gunicorn or Uvicorn as the application gateway server

and deployment using Docker.

This guide assumes that you are familiar with the basics of the Divio platform and have Docker and the Divio CLI installed. If not, please start with our complete tutorial for Django, or at least ensure that you have the basic tools in place.

Edit (or create) the project files

Start in an existing Django project, or if necessary, create a new directory.

The Dockerfile

Create a file named Dockerfile, adding:

FROM python:3.8
WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt

(change the version of Python if required).

Python requirements in requirements.txt

The Dockerfile expects to find a requirements.txt file, so add one if required. Where indicated below, choose the appropriate options to install the components for Postgres/MySQL, and uWSGI/Uvicorn/Gunicorn, for example:

django
dj-database-url
django-storage-url
whitenoise
# for S3 media storage
boto3

# Select one of the following for the database
psycopg2
# mysqlclient

# Select one of the following for the gateway server
uwsgi
# gunicorn
# uvicorn
Check that the version of Django is correct, and include any other Python components required by your project.

Local container orchestration with docker-compose.yml

Create a docker-compose.yml file for local development purposes. This will replicate the web image used in cloud deployments, allowing you to run the application in an environment as close as possible to that of the cloud servers. Among other things, it will allow the project to use a Postgres or MySQL database (choose the appropriate lines below) running in a local container, and will provide convenient access to files inside the containerised application.

version: "2.4"

services:
  web:
    # the application's web service (container) will use an image based on our Dockerfile
    build: "."
    # map the internal port 80 to port 8000 on the host
    ports:
      - "8000:80"
    # map the host directory to /app (which allows us to see and edit files inside the container)
    volumes:
      - ".:/app:rw"
      - "./data:/data:rw"
    # the default command to run whenever the container is launched
    command: python manage.py runserver 0.0.0.0:80
    # the URL 'postgres' or 'mysql' will point to the application's db service
    links:
      - "database_default"
    env_file: .env-local

  # Select one of the following configurations for the database
  database_default:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: "db"
      POSTGRES_HOST_AUTH_METHOD: "trust"
      SERVICE_MANAGER: "fsm-postgres"
    volumes:
      - ".:/app:rw"

  # database_default:
  #   image: mysql:5.7
  #   environment:
  #     MYSQL_DATABASE: "db"
  #     MYSQL_ALLOW_EMPTY_PASSWORD: "yes"
  #     SERVICE_MANAGER: "fsm-mysql"
  #   volumes:
  #     - ".:/app:rw"
  #     - "./data/db:/var/lib/mysql"
  #   healthcheck:
  #     test: "/usr/bin/mysql --user=root -h 127.0.0.1 --execute \"SHOW DATABASES;\""
  #     interval: 2s
  #     timeout: 20s
  #     retries: 10

Local configuration using .env-local

As you will see above, the web service refers to an env_file containing the environment variables that will be used in the local development environment. Create a .env-local file, containing:

# Select one of the following for the database
DATABASE_URL=postgres://postgres@database_default:5432/db
# DATABASE_URL=mysql://root@database_default:3306/db

DJANGO_DEBUG=True
DOMAIN_ALIASES=localhost, 127.0.0.1
SECURE_SSL_REDIRECT=False

Build with Docker

Now you can build the application containers locally:

docker-compose build

Create a minimal Django project if required

If you need to create a new Django project, you can run the startproject command inside the Docker application’s container:

docker-compose run web django-admin startproject myapp .


Edit your settings file (for example, myapp/settings.py) to add some code that will read configuration from environment variables instead of hard-coding it. Add some imports:

import os
import dj_database_url
from django_storage_url import dsn_configured_storage_class

Next, add some security-related settings. The cloud environments will provide some of these values as environment variables where appropriate; in all cases the settings will fall back to safe values if an environment variable is not provided:

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = os.environ.get('SECRET_KEY', '<a string of random characters>')

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = os.environ.get('DJANGO_DEBUG') == "True"

DIVIO_DOMAIN = os.environ.get('DOMAIN', '')
DIVIO_DOMAIN_ALIASES = [
    d.strip()
    for d in os.environ.get('DOMAIN_ALIASES', '').split(',')
    if d.strip()
]
ALLOWED_HOSTS = [DIVIO_DOMAIN] + DIVIO_DOMAIN_ALIASES

# Redirect to HTTPS by default, unless explicitly disabled
SECURE_SSL_REDIRECT = os.environ.get('SECURE_SSL_REDIRECT') != "False"
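Because configuration arrives as plain strings in environment variables, it is worth seeing how these snippets behave in isolation. A standalone sketch, using hypothetical domain values rather than anything a real environment provides:

```python
import os

# hypothetical values, as a cloud environment might supply them
os.environ["DOMAIN"] = "example.divio.app"
os.environ["DOMAIN_ALIASES"] = "example.com, www.example.com"
os.environ["DJANGO_DEBUG"] = "true"  # note: not exactly "True"

# the same parsing as in the settings file
DIVIO_DOMAIN = os.environ.get("DOMAIN", "")
DIVIO_DOMAIN_ALIASES = [
    d.strip()
    for d in os.environ.get("DOMAIN_ALIASES", "").split(",")
    if d.strip()
]
print([DIVIO_DOMAIN] + DIVIO_DOMAIN_ALIASES)
# ['example.divio.app', 'example.com', 'www.example.com']

# DEBUG is enabled only by the exact string "True"
print(os.environ.get("DJANGO_DEBUG") == "True")  # False
```

Note the string comparison: any value other than the exact string "True" (including "true" or "1") leaves DEBUG off, which is the safe default.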

Configure database settings:

# Configure database using DATABASE_URL; fall back to sqlite in memory when no
# environment variable is available, e.g. during Docker build
DATABASE_URL = os.environ.get('DATABASE_URL', 'sqlite://:memory:')

DATABASES = {'default': dj_database_url.parse(DATABASE_URL)}
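dj-database-url turns the single DATABASE_URL string into the dictionary that Django's DATABASES setting expects. To illustrate what the URL encodes, here is a simplified sketch using only the standard library (not the actual dj-database-url implementation, which handles many more schemes and options):

```python
from urllib.parse import urlparse

def parse_db_url(url):
    # split a database URL into Django-style connection settings (sketch only)
    parts = urlparse(url)
    engines = {
        "postgres": "django.db.backends.postgresql",
        "mysql": "django.db.backends.mysql",
    }
    return {
        "ENGINE": engines.get(parts.scheme, parts.scheme),
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username or "",
        "PASSWORD": parts.password or "",
        "HOST": parts.hostname or "",
        "PORT": parts.port or "",
    }

config = parse_db_url("postgres://postgres@database_default:5432/db")
print(config["ENGINE"])  # django.db.backends.postgresql
print(config["HOST"], config["PORT"], config["NAME"])  # database_default 5432 db
```

The hostname database_default matches the database service name in docker-compose.yml locally; on the cloud, the same variable points at the managed database cluster.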

Configure static and media settings. First, add WhiteNoise's middleware to the MIDDLEWARE list, immediately after Django's SecurityMiddleware:

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'whitenoise.middleware.WhiteNoiseMiddleware',
    # ...
]
and then:

STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')

# Media files

# read the setting value from the environment variable
DEFAULT_STORAGE_DSN = os.environ.get('DEFAULT_STORAGE_DSN')

# dsn_configured_storage_class() requires the name of the setting
DefaultStorageClass = dsn_configured_storage_class('DEFAULT_STORAGE_DSN')

# Django's DEFAULT_FILE_STORAGE requires the class name
DEFAULT_FILE_STORAGE = 'myapp.settings.DefaultStorageClass'

# only required for local file storage and serving, in development
MEDIA_URL = 'media/'
MEDIA_ROOT = os.path.join('/data/media/')

(Note that the DEFAULT_FILE_STORAGE assumes your Django project was named myapp.)
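Note that DEFAULT_FILE_STORAGE is a dotted-path string rather than a class object: Django resolves it to the class at runtime (its helper for this is django.utils.module_loading.import_string). A minimal sketch of that resolution, demonstrated on a standard-library name so it runs anywhere:

```python
from importlib import import_module

def import_string(dotted_path):
    # resolve 'package.module.Name' to the named object
    # (simplified sketch of what Django's import_string helper does)
    module_path, _, name = dotted_path.rpartition(".")
    return getattr(import_module(module_path), name)

# e.g. resolving a standard-library class by its dotted path
OrderedDict = import_string("collections.OrderedDict")
print(OrderedDict.__name__)  # OrderedDict
```

This is why the setting must name the module where DefaultStorageClass is defined (myapp.settings in this example).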

Add a URL pattern for serving media files in local development

You will need to edit the project’s URL configuration (for example, myapp/urls.py):

from django.conf import settings
from django.conf.urls.static import static

urlpatterns = [
    # ... your URL patterns
]

if settings.DEBUG:
    urlpatterns.extend(static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT))

Extend the Dockerfile

Append a command to the Dockerfile that will collect static files. Finally, depending on which application gateway server you installed above, include the appropriate command to launch the application when a container starts:

RUN python manage.py collectstatic --noinput

# Select one of the following application gateway server commands
CMD uwsgi --http=0.0.0.0:80 --module=myapp.wsgi
CMD gunicorn --bind=0.0.0.0:80 --forwarded-allow-ips="*" myapp.wsgi
CMD uvicorn --host=0.0.0.0 --port=80 myapp.asgi:application

(Note that this assumes your Django project was named myapp.)

Run database migrations if required

The database may need to be migrated before you can start any application development work:

docker-compose run web python manage.py migrate

And create a Django superuser:

docker-compose run web python manage.py createsuperuser

Or, you can import the database content from an existing database.

Check the local site

You can now start up the site locally to test it:

docker-compose up

and log in to the admin at http://127.0.0.1:8000/admin.

All the site’s configuration (Debug mode, ALLOWED_HOSTS, database settings, etc) is being provided by the environment variables in the .env-local file. On the cloud, the environment variables will be provided automatically by each environment.
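The .env-local file itself is a plain KEY=value format. A minimal sketch of how such a file is interpreted (docker-compose's own env_file parsing is more elaborate; this is illustrative only):

```python
import io

def load_env(stream):
    # read KEY=value lines, skipping blanks and comments
    env = {}
    for raw in stream:
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = io.StringIO(
    "# local settings\n"
    "DJANGO_DEBUG=True\n"
    "SECURE_SSL_REDIRECT=False\n"
)
env = load_env(sample)
print(env)  # {'DJANGO_DEBUG': 'True', 'SECURE_SSL_REDIRECT': 'False'}
```

Each entry becomes an environment variable inside the web container, which is exactly where the settings file's os.environ.get() calls pick them up.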

Deployment and further development

Create a new project on Divio

In the Divio Control Panel add a new project, selecting the Build your own option.

Add database and media services

The new project does not include any additional services; they must be added manually. Use the Services menu to add a Postgres or MySQL database to match your choice earlier, and an S3 object storage instance for media.

Connect the local project to the cloud project

Your Divio project has a slug, based on the name you gave it when you created it. Run divio project list -g to get your project’s slug; you can also read the slug from the Control Panel.


Then run:

divio project configure

and provide the slug. (This creates a new file in the project at .divio/config.json.)

If you have done this correctly, divio project dashboard will open the project in the Control Panel.

Configure the Git repository

Initialise the project as a Git repository if it’s not Git-enabled already:

git init .

A .gitignore file is needed to exclude unwanted files from the repository. Add:

# Python
*.pyc
__pycache__/

# Django
db.sqlite3
/staticfiles

# Divio
.divio
/data

# OS-specific patterns - add your own here
.DS_Store

Add the project’s Git repository as a remote, using the slug value in the remote address:

git remote add origin git@git.divio.com:<slug>.git

(Use e.g. divio instead if you already have a remote named origin.)

Commit your work

git add .                                                 # add all the newly-created files
git commit -m "Created new project"                       # commit
git push --set-upstream --force origin [or divio] master  # push, overwriting any unneeded commits made by the Control Panel at creation time

You’ll now see “1 undeployed commit” listed for the project in the Control Panel.

Deploy the Test server

Deploy with:

divio project deploy

(or use the Deploy button in the Control Panel).

Once deployed, your project will be accessible via the Test server URL shown in the Control Panel (append /admin).

Working with the database on the cloud

Your cloud project does not yet have any content in the database, so you can’t log in or do any other work there. You can push the local database with the superuser you created to the Test environment:

divio project push db

or, SSH to a cloud container in the Test environment with divio project ssh and execute Django migrations and create a superuser there in the usual way.

You can run migrations automatically on deployment by adding a release command in the Control Panel.

Notes on working with the project

Using the Twelve-factor model places all configuration in environment variables, so that the project can readily be moved to another host or platform, or set up locally for development. The configuration for:

  • security
  • database
  • media
  • static files

settings is handled by a few simple code snippets in settings.py. In each case, the settings will fall back to safe and secure defaults.

Application container

In both local and cloud environments, the application will run in a web container, using the same image and exactly the same codebase.

Django server

In cloud environments: the Dockerfile contains a CMD that starts up Django using the uWSGI/Gunicorn/Uvicorn application gateway server.

In the local environment: the command line in docker-compose.yml starts up Django using the runserver, overriding the CMD in the Dockerfile. If the command line is commented out, docker-compose up will use the application gateway server locally instead.


Database

In cloud environments: the application will use one of our database clusters.

In the local environment: the application will use a container running the same database.

During the build phase: the database falls back to in-memory SQLite, as there is no database available to connect to, and no configuration variables available from the environment in any case.

Security settings

Debug mode

In cloud environments: the application will safely fall back to DEBUG = False.

In the local environment: .env-local supplies a DJANGO_DEBUG variable to allow Django to run in debug mode.

Secret key

In cloud environments: a random SECRET_KEY variable is always provided and will be used.

In the local environment: where no SECRET_KEY environment variable is provided, the application will fall back to the hard-coded key in settings.py.

Allowed hosts

In cloud environments: DOMAIN and DOMAIN_ALIASES variables are always provided and will be used.

In the local environment: default values are provided via the DOMAIN_ALIASES environment variable in .env-local.

Static files

In cloud environments: the application gateway server and WhiteNoise are used.

In the local environment: static files are served by Django's runserver. To test WhiteNoise locally, run the application gateway server instead, with DEBUG = False enforced.

Media files

In cloud environments: file storage and serving is handled by the S3 instance.

In the local environment: the local filesystem is used for storage, and Django’s runserver is used to serve media. If a cloud environment’s DEFAULT_STORAGE_DSN is applied in the .env-local file, the local server will use the S3 instance instead.

Database migrations

In its current state, database migrations are not executed automatically in cloud deployments. To run migrations automatically, add a release command in the Control Panel: python manage.py migrate. Alternatively, you can run the command manually in the cloud environment using SSH.