library, docs & test updates #29

Merged: 8 commits, Jun 28, 2024
Changes from 2 commits
92 changes: 47 additions & 45 deletions .github/workflows/test-ubuntu.yml
@@ -5,8 +5,8 @@ on:
     branches:
       - master
     paths-ignore:
-      - '.gitignore'
-      - '**.md'
+      - ".gitignore"
+      - "**.md"

 jobs:
   build:
@@ -20,10 +20,10 @@ jobs:
           POSTGRES_DB: gis_test
           ALLOW_IP_RANGE: 0.0.0.0/0
         options: >-
-          --health-cmd pg_isready
-          --health-interval 10s
-          --health-timeout 5s
-          --health-retries 5
+          --health-cmd pg_isready
+          --health-interval 10s
+          --health-timeout 5s
+          --health-retries 5
         ports:
           - 5432:5432
       redis:
@@ -37,50 +37,52 @@ jobs:
           - 6379:6379

     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v2

-      - name: Set up Python 3.10
-        uses: actions/setup-python@v2
-        with:
-          python-version: '3.10'
+      - name: Set up Python 3.12
+        uses: actions/setup-python@v2
+        with:
+          python-version: "3.12"

-      - name: Install and set up Poetry
-        run: |
-          curl -sSL https://install.python-poetry.org | python
-          $HOME/.local/bin/poetry config virtualenvs.in-project true
+      - name: Install and set up Poetry
+        run: |
+          curl -sSL https://install.python-poetry.org | python
+          $HOME/.local/bin/poetry config virtualenvs.in-project true

-      - name: Cache dependencies.py
-        uses: actions/cache@v2
-        with:
-          path: .venv
-          key: venv-3.10-${{ hashFiles('**/poetry.lock') }}
+      - name: Cache dependencies.py
+        uses: actions/cache@v2
+        with:
+          path: .venv
+          key: venv-3.12-${{ hashFiles('**/poetry.lock') }}

-      - name: Install dependencies.py
-        run: |
-          $HOME/.local/bin/poetry install
+      - name: Install dependencies.py
+        run: |
+          $HOME/.local/bin/poetry install

-      - name: Install osmium & osmctools
-        run: |
-          sudo apt-get update
-          sudo apt-get install -y -qq osmium-tool osmctools
-          echo $(osmium --version)
+      - name: Install osmium & osmctools
+        run: |
+          sudo apt-get update
+          sudo apt-get install -y -qq osmium-tool osmctools
+          echo $(osmium --version)

-      - name: linting
-        run: |
-          source .venv/bin/activate
-          pre-commit run --all-files
+      - name: linting
+        run: |
+          source .venv/bin/activate
+          pre-commit run --all-files

-      - name: pytest and coverage
-        run: |
-          source .venv/bin/activate
-          sudo python -m smtpd -n -c DebuggingServer localhost:1025 &
-          sudo docker volume create routing-packager_packages --driver local --opt type=none --opt device=$PWD --opt o=bind
-          export API_CONFIG=test
-          pytest --cov=routing_packager_app --ignore=tests/test_tasks.py
-          coverage lcov --include "routing_packager_app/*"
+      - name: pytest and coverage
+        run: |
+          source .venv/bin/activate
+          sudo python -m smtpd -n -c DebuggingServer localhost:1025 &
+          sudo docker volume create routing-packager_packages --driver local --opt type=none --opt device=$PWD --opt o=bind &
+          sudo docker volume create routing-packager_tmp_packages --driver local --opt type=none --opt device=$PWD --opt o=bind
+          export API_CONFIG=test
+          pytest --cov=routing_packager_app --ignore=tests/test_tasks.py
+          coverage lcov --include "routing_packager_app/*"

-      - name: coveralls
-        uses: coverallsapp/github-action@master
-        with:
-          github-token: ${{ secrets.GITHUB_TOKEN }}
-          path-to-lcov: ./coverage.lcov
+      - name: coveralls
+        uses: coverallsapp/github-action@master
+        with:
+          github-token: ${{ secrets.GITHUB_TOKEN }}
+          path-to-lcov: ./coverage.lcov
4 changes: 3 additions & 1 deletion .gitignore
@@ -15,4 +15,6 @@ scan*.txt

 # temp
 .env_local
-.env
+.env
+
+.vscode/
21 changes: 11 additions & 10 deletions .pre-commit-config.yaml
@@ -1,11 +1,12 @@
 repos:
-  - repo: https://github.com/psf/black
-    rev: 22.3.0
-    hooks:
-      - id: black
-        language_version: python3
-        args: [routing_packager_app, tests]
-  - repo: https://github.com/pycqa/flake8
-    rev: 4.0.1 # pick a git hash / tag to point to
-    hooks:
-      - id: flake8
+  - repo: https://github.com/ambv/black
+    rev: 23.9.1
+    hooks:
+      - id: black
+        language_version: python3
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    # Ruff version.
+    rev: v0.0.289
+    hooks:
+      - id: ruff
+        args: [--fix]
29 changes: 20 additions & 9 deletions CONTRIBUTING.md
@@ -5,10 +5,12 @@
 We :heart: patches, fixes & feature PRs and want to make sure everything goes smoothly for you before and while submitting a PR.

 For development we use:
+
 - [`poetry`](https://github.com/python-poetry/poetry/) as package manager
 - `pytest` for testing
-- Google's [`yapf`](https://github.com/google/yapf) to make sure the formatting is consistent.
-- [`pre-commit`](https://pre-commit.com) hook for yapf
+- [`black`](https://github.com/psf/black) to make sure the formatting is consistent.
+- [`ruff`](https://github.com/astral-sh/ruff) for linting
+- [`pre-commit`](https://pre-commit.com) hook for formatting and linting

 When contributing, ideally you:

@@ -23,11 +25,13 @@ When contributing, ideally you:
 1. Create and activate a new virtual environment

 2. Install development dependencies:
+
 ```bash
 poetry install
 ```

-3. Please add a pre-commit hook for `yapf`, so your code gets auto-formatted before committing it:
+3. Please add a pre-commit hook, so your code gets auto-formatted and linted before committing it:
+
 ```bash
 pre-commit install
 ```

@@ -37,15 +41,22 @@ pre-commit install
 You'll need a few things to run the tests:

 - PostgreSQL installation with a DB named `gis_test` (or define another db name using `POSTGRES_DB_TEST`) **and PostGIS enabled**
-- Redis database, best done with `docker run --name redis -p 6379:6379 -d redis:6.0`, then you can use the project's defaults, i.e. `REDIS_URL=redis://localhost:6379/0`
-- some fake SMTP service to handle email tests, our recommendations:
-  - [fake-smtp-server](https://www.npmjs.com/package/fake-smtp-server): NodeJS app with a frontend on `http://localhost:1080` and SMTP port 1025
-  - pure Python one-liner in a separate terminal window: `sudo python -m smtpd -n -c DebuggingServer localhost:1025`
+- Redis database
+
+Both can be quickly spun up by using the provided `docker-compose.test.yml`:
+
+```bash
+docker compose -f docker-compose.test.yml up -d
+```
+
+You'll also need some fake SMTP service to handle email tests, our recommendation: [fake-smtp-server](https://www.npmjs.com/package/fake-smtp-server), a NodeJS app with a frontend on `http://localhost:1080` and SMTP port 1025.

 We use `pytest` in this project with `coverage`:

 ```bash
 export API_CONFIG=test
 pytest --cov=routing_packager_app
 ```

 A `coverage` bot will report the coverage in every PR and we might ask you to increase coverage on new code.
2 changes: 1 addition & 1 deletion Dockerfile
@@ -54,7 +54,7 @@ MAINTAINER Nils Nolde <[email protected]>
 RUN apt-get update > /dev/null && \
     export DEBIAN_FRONTEND=noninteractive && \
     apt-get install -y libluajit-5.1-2 \
-    libzmq5 libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
+    libzmq5 libgdal-dev libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
     libsqlite3-0 libsqlite3-mod-spatialite libcurl4 python-is-python3 osmctools \
     python3.11-minimal python3-distutils curl unzip moreutils jq spatialite-bin supervisor > /dev/null
17 changes: 9 additions & 8 deletions README.md
@@ -41,7 +41,7 @@ curl --location -XPOST 'http://localhost:5000/api/v1/jobs' \
 --header 'Content-Type: application/json' \
 --data-raw '{
     "name": "test",  # name needs to be unique for a specific router & provider
-    "description": "test descr",
+    "description": "test descr",
     "bbox": "1.531906,42.559908,1.6325,42.577608",  # the bbox as minx,miny,maxx,maxy
     "provider": "osm",  # the dataset provider, needs to be registered in ENABLED_PROVIDERS
     "update": "true"  # whether this package should be updated on every planet build
@@ -51,6 +51,7 @@ curl --location -XPOST 'http://localhost:5000/api/v1/jobs' \
 After a minute you should have the graph package available in `./data/output/osm_test/`. If not, check the logs of the worker process or the Flask app.

 The `routing-packager-app` container running the HTTP API has a `supervisor` process running in a loop, which:
+
 - downloads a planet PBF (if it doesn't exist) or updates the planet PBF (if it does exist)
 - builds a planet Valhalla graph
 - then updates all graph extracts with a fresh copy
@@ -61,9 +62,9 @@ By default, also a fake SMTP server is started, and you can see incoming message

 ### Graph & OSM updates

-Under the hood we're running a `supervisor` instance to control the graph builds.
+Under the hood we're running a `supervisor` instance to control the graph builds.

-Two instances of the [Valhalla docker image](https://github.com/gis-ops/docker-valhalla) take turns building a new graph from an updated OSM file. Those two graphs are physically separated from each other in subdirectories `$DATA_DIR/osm/8002` & `$DATA_DIR/osm/8003`.
+Two instances of the [Valhalla docker image](https://github.com/gis-ops/docker-valhalla) take turns building a new graph from an updated OSM file. Those two graphs are physically separated from each other in subdirectories `$TMP_DATA_DIR/osm/8002` & `$TMP_DATA_DIR/osm/8003`.

 After each graph build finished, the OSM file is updated for the next graph build.

@@ -80,9 +81,9 @@ The app is listening on `/api/v1/jobs` for new `POST` requests to generate some
 1. Request is parsed, inserted into the Postgres database and the new entry is immediately returned with a few job details as blank fields.
 2. Before returning the response, the graph generation function is queued with `ARQ` in a Redis database to dispatch to a worker.
 3. If the worker is currently
-   - **idle**, the queue will immediately start the graph generation:
-     - Pull the job entry from the Postgres database
-     - Update the job's `status` database field along the processing to indicate the current stage
-     - Zip graph tiles from disk according to the request's bounding box and put the package to `$DATA_DIR/output/<JOB_NAME>`, along with a metadata JSON
-   - **busy**, the current job will be put in the queue and will be processed once it reaches the queue's head
+   - **idle**, the queue will immediately start the graph generation:
+     - Pull the job entry from the Postgres database
+     - Update the job's `status` database field along the processing to indicate the current stage
+     - Zip graph tiles from disk according to the request's bounding box and put the package to `$DATA_DIR/output/<JOB_NAME>`, along with a metadata JSON
+   - **busy**, the current job will be put in the queue and will be processed once it reaches the queue's head
 4. Send an email to the requesting user with success or failure notice (including the error message)
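The idle/busy dispatch in step 3 is plain FIFO queue semantics: the request returns immediately, and a single worker drains jobs in arrival order. A toy stdlib sketch of that behaviour (job names and the `worker`/`main` helpers are illustrative only; the real app dispatches through `ARQ` and Redis):

```python
import asyncio

async def worker(queue: asyncio.Queue, processed: list) -> None:
    """Single worker: pull the job at the queue's head, process it, repeat."""
    while True:
        job = await queue.get()
        if job is None:  # sentinel: no more jobs
            break
        # the real worker would update the job's status in Postgres here
        # and zip the graph tiles for the requested bounding box
        processed.append(job)

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    processed: list = []
    task = asyncio.create_task(worker(queue, processed))
    # POST /api/v1/jobs returns right away; the job just lands in the queue
    for name in ["osm_andorra", "osm_liechtenstein", "osm_malta"]:
        await queue.put(name)
    await queue.put(None)
    await task
    return processed

print(asyncio.run(main()))  # → ['osm_andorra', 'osm_liechtenstein', 'osm_malta']
```

If the worker is busy when a job arrives, the job simply waits in the queue until it reaches the head, which is exactly the **busy** branch above.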
14 changes: 14 additions & 0 deletions docker-compose.test.yml
@@ -0,0 +1,14 @@
+services:
+  postgres:
+    image: kartoza/postgis:14
+    environment:
+      POSTGRES_USER: admin
+      POSTGRES_PASS: admin
+      POSTGRES_DB: gis_test
+      ALLOW_IP_RANGE: 0.0.0.0/0
+    ports:
+      - 5432:5432
+  redis:
+    image: redis:6.2
+    ports:
+      - 6379:6379
6 changes: 3 additions & 3 deletions docker-compose.yml
@@ -1,6 +1,6 @@
 volumes:
   postgis-data:
-  packages: # do not change any detail of this volume
+  packages: # do not change any detail of this volume
     driver: local
     driver_opts:
       type: none
@@ -11,7 +11,6 @@ volumes:
 networks:
   routing-packager:

-version: '3.2'
 services:
   postgis:
     image: kartoza/postgis:12.1
@@ -31,7 +30,8 @@ services:
     restart: always
   redis:
     image: redis:6.2
-    container_name: routing-packager-redis
+    container_name:
+      routing-packager-redis
     # mostly needed to define the database hosts
     env_file:
       - .docker_env
11 changes: 7 additions & 4 deletions main.py
@@ -1,3 +1,4 @@
+from contextlib import asynccontextmanager
 import uvicorn as uvicorn
 from arq import create_pool
 from arq.connections import RedisSettings
@@ -10,11 +11,9 @@
 from routing_packager_app.config import SETTINGS
 from routing_packager_app.api_v1.models import User

-app: FastAPI = create_app()
-
-
-@app.on_event("startup")
-async def startup_event():
+@asynccontextmanager
+async def lifespan(app: FastAPI):
     SQLModel.metadata.create_all(engine, checkfirst=True)
     app.state.redis_pool = await create_pool(RedisSettings.from_dsn(SETTINGS.REDIS_URL))
     User.add_admin_user(next(get_db()))
@@ -24,7 +23,11 @@ async def startup_event():
     p = SETTINGS.get_data_dir().joinpath(provider.lower())
     p.mkdir(exist_ok=True)
     SETTINGS.get_output_path().mkdir(exist_ok=True)
+    yield
+    app.state.redis_pool.shutdown()


+app: FastAPI = create_app(lifespan=lifespan)

 if __name__ == "__main__":
     uvicorn.run("main:app", host="0.0.0.0", port=5000, reload=True)
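The `main.py` change swaps the deprecated `@app.on_event("startup")` hook for the lifespan pattern: a single `asynccontextmanager` whose code before `yield` runs at startup and whose code after runs at shutdown. The ordering can be seen with a framework-free sketch (the `events` list and `serve` helper are illustrative, not part of the PR):

```python
import asyncio
from contextlib import asynccontextmanager

events = []

@asynccontextmanager
async def lifespan(app):
    events.append("startup")    # e.g. create tables, open the Redis pool
    yield                       # the application serves requests here
    events.append("shutdown")   # e.g. close the Redis pool

async def serve():
    async with lifespan(app=None):
        events.append("handling requests")

asyncio.run(serve())
print(events)  # → ['startup', 'handling requests', 'shutdown']
```

Passing the context manager via `create_app(lifespan=lifespan)` lets FastAPI drive exactly this enter/exit sequence around the server's lifetime.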