This repository contains a docker-compose stack for developing TopoBank and its plugins. The modules are included as git submodules. The stack consists of
- the main topobank instance,
- Celery workers and beat,
- a PostgreSQL database,
- a Redis server and
- a Minio S3 server.
This repository uses git submodules. You need to initialize and update them after cloning this repository, i.e. with

```bash
git clone https://github.com/ContactEngineering/topobank-stack-development.git
cd topobank-stack-development/
git submodule init
git submodule update
```

or use a command like

```bash
git clone --recurse-submodules [email protected]:ContactEngineering/topobank-stack-development.git
```
If you need to clone without GitHub credentials available locally, please modify the URLs within the `.gitmodules` file to point to public addresses, e.g.

```
https://github.com/ContactEngineering/topobank.git
```

instead of

```
[email protected]:ContactEngineering/TopoBank.git
```

and run

```bash
git submodule sync --recursive
```

afterwards.
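As an alternative to editing `.gitmodules` by hand, the URL can be rewritten with `git config` — a sketch, assuming the submodule is registered under the name `topobank` (check `.gitmodules` for the names actually used):

```shell
# Point the "topobank" submodule at its public HTTPS URL instead of SSH.
# The submodule name "topobank" is an assumption; check .gitmodules for
# the actual names in this repository.
git config -f .gitmodules submodule.topobank.url \
    https://github.com/ContactEngineering/topobank.git
# Then run `git submodule sync --recursive` to propagate the change
# to the local git configuration.
```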
After initializing and updating the submodules, use

```bash
git submodule foreach git pull origin main
```

to pull the current main branch of each submodule from the remote.
Enter the project directory:

```bash
cd topobank-stack-development
```
In order to run the application, copy the ORCID configuration template:

```bash
cp .envs/.orcid.template .envs/.orcid
```

Then edit the file `.envs/.orcid` and replace these values:

```
ORCID_CLIENT_ID=<replace with your client ID from ORCID>
ORCID_SECRET=<replace with your secret from ORCID>
```

In order to find these values, register a public API client (see next chapter).
If you want to create test DOIs from the application, also copy the DataCite configuration template:

```bash
cp .envs/.datacite.template .envs/.datacite
```

This is optional; by default, DOI generation is skipped.
The rest of the settings in `.envs/.django` should be fine at their defaults for development.
You need to register a public API client on the ORCID website for the following purposes:
- get a client ID + secret in order to be able to authenticate against orcid.org
- set a redirect URL to which TopoBank will redirect after successful authentication

See the ORCID documentation for more information on how to do this. As redirect URL, add:

```
http://127.0.0.1:8000/accounts/orcid/login/callback
```
To build the stack, run

```bash
TOPOBANK_UID=$(id -u) TOPOBANK_GID=$(id -g) docker compose build
```

Note that you need either the `compose` plugin of Docker or the (old) standalone `docker-compose`, which can be installed via `pip`.
You could also copy the template file `.env.template` to `.env` and fill in these two numbers, so you don't have to prefix the `docker compose` commands. You can find the IDs by calling the `id` command on Linux, which returns your `uid` and `gid` values.
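As a sketch, the `.env` file can be generated directly from the output of `id` (the variable names follow the `docker compose build` command above):

```shell
# Write TOPOBANK_UID/TOPOBANK_GID for the current user into .env,
# so that `docker compose` picks them up without a command prefix.
printf 'TOPOBANK_UID=%s\nTOPOBANK_GID=%s\n' "$(id -u)" "$(id -g)" > .env
cat .env
```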
Important: If you have a VPN active, it may shadow the Docker mirror sites and break the build process (because the images cannot be pulled).
Run the whole stack with

```bash
docker compose up -d
```

The stack automatically initializes the database and creates an S3 bucket. Especially the first time, this can take a while. To see what is going on, you can look at the logs with:

```bash
docker compose logs -f
```

If you only want to see the logs of one service, e.g. `django`, run:

```bash
docker compose logs -f django
```
You are now able to log in via ORCID and upload data, but you will not have access to any analysis functionality yet.
When running for the first time, in order to see the analysis functions from the plugins, make sure you have added an organization `World`, which is linked to the group `all`, and add permissions for all commonly available plugins:
First give your development user admin permissions such that you can enter the admin interface:

```bash
docker compose exec django python manage.py grant_admin_permissions your_username
```

You have to replace `your_username` with the correct username. In order to find it, log in with your ORCID, open the "User Profile" page and take the last part of the URL. Example: If the URL is `https://127.0.0.1:8000/users/anna/`, then `your_username` is `anna`. After granting the permission, you can enter the admin page. The link to the admin page can be found by this user in the menu item which is named after the user.
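To illustrate the mapping from profile URL to username (a hypothetical shell example, not a TopoBank command), the username is simply the last path component of the URL:

```shell
# Extract the last path component of the profile URL; for
# https://127.0.0.1:8000/users/anna/ this prints "anna".
url="https://127.0.0.1:8000/users/anna/"
basename "$url"
```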
In the `Organization` model, create a new organization with name `World`. As available plugins, enter e.g. `topobank_contact, topobank_statistics`. Pay attention to using underscores where otherwise dashes appear. As group, choose `all`. Then all users, including the anonymous user, will be able to use the mentioned plugins.
To have the TopoBank platform communicate with the local Minio S3 server, you will also have to add `topobank-minio-alias` as another name for `localhost` to your `/etc/hosts` file, e.g.

```
127.0.0.1 localhost topobank-minio-alias
```
List all plugin submodules in `.envs/.django` on a single line, separated by whitespace:

```
TOPOBANK_PLUGINS="topobank-statistics topobank-contact topobank-publication ce-ui"
```
When requirements in submodules change, update `requirements/development.txt` by installing `pip-compile` (provided by the `pip-tools` package) and running `make` from within `requirements`. Plugins with private dependencies may require access tokens provided in environment variables during this process. These secret tokens will be embedded as clear text in `requirements/development.txt`. Thus, do not commit this requirements file.
Make sure all submodules point to the head of the respective branch you want to use in your development stack.
To open a Django shell inside the stack, run:

```bash
docker compose run --rm django python manage.py shell
```
To configure tests in PyCharm, please consider the following:
- In File → Settings → Docker → Tools, enable Docker Compose V2.
- Create a new interpreter "On Docker Compose..." that runs within the compose configuration.
- Create a new pytest configuration that runs within this interpreter. Add `DJANGO_SETTINGS_MODULE=topobank.settings.test` to the environment.
- For testing plugins, also add `PYTHONPATH=/development-stack/topobank` to the environment.
Copy the database dump file to the `/backups` location in the PostgreSQL container:

```bash
docker cp file.sql container:/backups
```

Open a shell in the PostgreSQL container:

```bash
docker compose exec postgres /bin/bash
```

Run the import:

```bash
PGPASSWORD=$POSTGRES_PASSWORD psql -h $POSTGRES_HOST -p $POSTGRES_PORT -U $POSTGRES_USER --dbname $POSTGRES_DB -f /backups/file.sql
```
The web app includes functionality for publishing datasets, which means registering a DOI for them. Actual DOI registration only works in the production environment.
Development of this project is funded by the European Research Council within Starting Grant 757343.