How to interact with your application’s database¶
The database for your Divio application runs:
in a Docker container for your local projects: Interact with the local database
on a dedicated cluster for your cloud-deployed sites: Interact with the Cloud database
In either case, you will mostly interact with the database using the tools provided by your project’s runtime stack (e.g. Django). However, if you need to interact with it directly, the option exists.
The database service name
The Divio CLI expects that the database service will be named database_default in your
docker-compose.yml file. If it is not, certain commands (such as
divio project push/pull db) will fail.
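As a sketch, the relevant section of a docker-compose.yml file with the expected service name might look like this (the image version here is illustrative):

```yaml
services:
  # The Divio CLI looks for a database service with this name
  database_default:
    image: postgres:9.6-alpine
```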
Interact with the local database¶
Generally, the most convenient way to interact with the application’s database is to do it locally (with a local copy of your cloud data if necessary).
From the project’s local Django web container¶
docker-compose run --rm web python ./manage.py dbshell
Connecting to a Postgres database manually¶
You can also make the connection manually from within the
web container, for example:
docker-compose run --rm web psql -h database_default -U postgres db
The -h value (for host) needs to match the name of the database service in the
docker-compose.yml file, which might be different in your project.
As well as psql you can run commands such as pg_restore. This is useful
for a number of common operations, described below.
Your project may not have the psql client installed already, in which case you will need to install it first. See
How to install system packages in a project.
Another way of interacting with the database is via the database container itself, using
docker exec. This requires that the database container already be up and running.
For example, if your database container is called example_database_default_1:
docker exec -i example_database_default_1 psql -U postgres
From your host environment¶
If you have a preferred database management tool that runs on your own computer, you can also connect to the database from outside the application.
Expose the database’s port to the host¶
In order to connect to the database from a tool running directly on your own machine, you will need to expose its port (5432 by default for Postgres).
Add a ports section to the database service in
docker-compose.yml and map the
port to your host. For Postgres, for example:
database_default:
  image: postgres:9.6
  ports:
    - 5432:5432
This means that external traffic reaching the container on port 5432 will be routed to port 5432 internally.
The ports are given as
<host port>:<container port> - you can choose another host
port if port 5432 is already in use on your host.
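For example, if port 5432 is already taken on your host, you could map a different host port (5433 here is an arbitrary choice) to the container’s port 5432:

```yaml
database_default:
  image: postgres:9.6
  ports:
    # <host port>:<container port>
    - 5433:5432
```

You would then connect from your host with the chosen port, e.g. psql -h 127.0.0.1 -p 5433 -U postgres db.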
Now restart the database container with:
docker-compose up -d database_default
Connect to a Postgres database¶
You will need to use the following details:
host: 127.0.0.1
port: 5432 (or the host port you chose)
username: postgres
password: not required
database: db
Access the database using your Postgres tool of choice. Note that you must
specify the host address explicitly, otherwise the client may attempt to connect over a local socket.
For example, if you’re using the
psql command line tool, you can connect to the project database with:
psql -h 127.0.0.1 -U postgres db
Interact with the Cloud database¶
Use the divio project pull db and
divio project push db commands to copy a database between a cloud environment
and your own local environment.
Note that the
pull operation downloads a binary database dump (in a tarred archive), whereas
push creates and uploads a SQL database dump.
See the Divio CLI command reference for more on using these commands.
From the project’s Cloud application container¶
Log into your Cloud project’s container (Test or Live) over SSH.
dbshell in a Django project¶
Run:
python manage.py dbshell
This will drop you into a command-line client, connected to your database.
Connecting to a database manually¶
You can also make the connection manually. Run
env to list your environment variables. Amongst
them you’ll find
DATABASE_URL, which will be in the form:
schema://<user name>:<password>@<address>:<port>/<name>
You can use these credentials in the appropriate client, e.g. psql.
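If you need the parts of DATABASE_URL separately (for example, to pass them to a client as individual options), you can split the URL with Python’s standard library. The URL below is a made-up example, not a real credential:

```python
from urllib.parse import urlsplit

# A made-up example of the kind of URL found in DATABASE_URL
url = urlsplit("postgres://user123:secret@db-host.example.com:5432/db")

print(url.scheme)            # postgres
print(url.username)          # user123
print(url.password)          # secret
print(url.hostname)          # db-host.example.com
print(url.port)              # 5432
print(url.path.lstrip("/"))  # db  (the database name)
```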
From your own computer¶
Access to cloud databases other than from the associated application containers is not possible - it is restricted, for security reasons, to containers running on our own infrastructure.
Change the local database engine version¶
Sometimes, you will need to change the database engine, or the engine version, that your local project uses - for example if the cloud database is updated or changed. If the two database engines do not match, you may run into problems.
The local database engine is specified by the
image option in the database service (usually called database_default) in the
docker-compose.yml file, for example:
database_default:
  image: postgres:9.6-alpine
Should you need to change this, that line should be updated - for example if the Cloud database is now running Postgres 11:
database_default:
  image: postgres:11-alpine
Docker will use the new version the next time the local project is launched.
If you are not sure what image to use for the local database, Divio support will be able to advise you.
In the Divio architecture, the
docker-compose.yml file is not
used for Cloud deployments, but only for the local server. The changes you
make here will not affect the Cloud database.
Manage Postgres extensions¶
Although you cannot create extensions yourself on our shared database clusters, we can often enable extensions for you on request. The most commonly-requested of these is PostGIS. Please contact Divio support for this.
You will run into errors if you perform an operation that requires or tries to create a missing extension, for example:
psycopg2.errors.InsufficientPrivilege: permission denied to create extension "unaccent"
from a database migration, or:
---> Processing error!
from the divio push db command, when the local database uses an extension not available on the cloud.
Run the Postgres
\dx command in a local database shell or in a cloud shell to list
extensions that you’re using.
Usage examples for common basic operations¶
It’s beyond the scope of this article to give general guidance on using the database, but these examples will help give you an idea of some typical operations that you might undertake while using Divio.
All the examples assume that you are interacting with the local database, running in its
Docker container, and will use Postgres.
In each case, we launch the command from within the
web container using docker-compose run --rm web, and we specify the database host (-h database_default) and the database user (-U postgres).
Dump the database¶
Using the web service, dump the database named
db to a file named database.dump:
docker-compose run --rm web pg_dump -h database_default -U postgres db > database.dump
Drop the database¶
Drop (delete) the database named db:
docker-compose run --rm web dropdb -h database_default -U postgres db
Create the database¶
Create a database named db:
docker-compose run --rm web createdb -h database_default -U postgres db
Add the hstore extension (required on a newly-created local database) to the database named db:
docker-compose run --rm web psql -h database_default -U postgres db -c "CREATE EXTENSION hstore"
Restore the database¶
Restore a database named
db from a file named database.dump:
docker-compose run --rm web pg_restore -h database_default -U postgres -d db database.dump --no-owner
Reset the database¶
In a Django project, after dropping and recreating the database, run the migrations to rebuild its tables:
docker-compose run --rm web python manage.py migrate
Restore from a downloaded Cloud backup¶
Untar the downloaded
backup.tar file. It contains a
database.dump file. Copy the file to
your local project directory, then run the commands above to drop and create the database, create the hstore extension, and then
restore from a file.
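The untarring step can also be scripted. This is a minimal sketch, assuming the file names described above (a backup.tar archive containing a database.dump file):

```python
import tarfile
from pathlib import Path


def extract_dump(archive="backup.tar", member="database.dump", dest="."):
    """Extract the database dump from a downloaded backup archive.

    Returns the path of the extracted dump file.
    """
    with tarfile.open(archive) as tar:
        tar.extract(member, path=dest)
    return Path(dest) / member


if __name__ == "__main__":
    # Assumes a downloaded backup.tar in the current directory
    if Path("backup.tar").exists():
        print(extract_dump())
```

The resulting database.dump can then be passed to pg_restore as shown in the restore example above.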