Add logging endpoint #41

Open
wants to merge 5 commits into master
Changes from 2 commits
2 changes: 0 additions & 2 deletions .docker_env
@@ -13,8 +13,6 @@ ADMIN_PASS=admin
# DATA_DIR=/home/nilsnolde/dev/gis-ops/routing-graph-packager/data
VALHALLA_URL="http://app"

VALHALLA_IMAGE=gisops/valhalla:latest
Collaborator Author:
superfluous


POSTGRES_DB=gis
POSTGRES_USER=admin
POSTGRES_PASS=admin
7 changes: 1 addition & 6 deletions .pre-commit-config.yaml
@@ -1,12 +1,7 @@
repos:
- repo: https://github.com/ambv/black
Collaborator Author:
just use ruff for linting and formatting

rev: 23.9.1
hooks:
- id: black
language_version: python3
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.289
rev: v0.7.4
hooks:
- id: ruff
args: [--fix]
39 changes: 19 additions & 20 deletions Dockerfile
@@ -1,22 +1,21 @@
#--- BEGIN Usual Python stuff ---

FROM ghcr.io/valhalla/valhalla:latest as builder
LABEL [email protected]
FROM ghcr.io/valhalla/valhalla:latest AS builder
LABEL maintainer="Nils Nolde <[email protected]>"
Collaborator Author:
old format was deprecated


WORKDIR /app

# Install vis
RUN apt-get update -y > /dev/null && \
apt-get install -y \
apt-transport-https \
ca-certificates \
python-is-python3 \
python3-pip \
python3-venv \
curl > /dev/null && \
python -m pip install --upgrade pip --break-system-packages
apt-transport-https \
ca-certificates \
python-is-python3 \
python3-pip \
python3-venv \
curl > /dev/null
Collaborator Author:
also contains the ubuntu 24.04 lts update


ENV POETRY_BIN /root/.local/bin/poetry
ENV POETRY_BIN=/root/.local/bin/poetry
Collaborator Author:
old format was deprecated


RUN curl -sSL https://install.python-poetry.org | python && \
$POETRY_BIN config virtualenvs.create false && \
@@ -42,21 +41,21 @@ RUN . app_venv/bin/activate && \

# remove some stuff from the original image
RUN cd /usr/local/bin && \
preserve="valhalla_service valhalla_build_tiles valhalla_build_config valhalla_build_admins valhalla_build_timezones valhalla_build_elevation valhalla_ways_to_edges valhalla_build_extract valhalla_export_edges valhalla_add_predicted_traffic" && \
mv $preserve .. && \
for f in valhalla*; do rm $f; done && \
cd .. && mv $preserve ./bin
preserve="valhalla_service valhalla_build_tiles valhalla_build_config valhalla_build_admins valhalla_build_timezones valhalla_build_elevation valhalla_ways_to_edges valhalla_build_extract valhalla_export_edges valhalla_add_predicted_traffic" && \
mv $preserve .. && \
for f in valhalla*; do rm $f; done && \
cd .. && mv $preserve ./bin

FROM ubuntu:23.04 as runner_base
MAINTAINER Nils Nolde <[email protected]>
FROM ubuntu:24.04 AS runner_base
LABEL maintainer="Nils Nolde <[email protected]>"

# install Valhalla stuff
RUN apt-get update > /dev/null && \
export DEBIAN_FRONTEND=noninteractive && \
apt-get install -y libluajit-5.1-2 \
libzmq5 libgdal-dev libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
libsqlite3-0 libsqlite3-mod-spatialite libcurl4 python-is-python3 osmctools \
python3.11-minimal python3-distutils curl unzip moreutils jq spatialite-bin supervisor > /dev/null
apt-get install -y libluajit-5.1-dev \
libzmq5 libgdal-dev libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
libsqlite3-0 libsqlite3-mod-spatialite libcurl4 python-is-python3 osmctools \
python3.12-minimal curl unzip moreutils jq spatialite-bin supervisor > /dev/null

WORKDIR /app

5 changes: 5 additions & 0 deletions README.md
@@ -14,6 +14,7 @@ The default road dataset is the [OSM](openstreetmap.org) planet PBF. If availabl
- **data updater**: includes a daily OSM updater
- **asynchronous API**: graph generation is outsourced to a [`ARQ`](https://github.com/samuelcolvin/arq) worker
- **email notifications**: notifies the requesting user if the job succeeded/failed
- **logs API**: reads the logs for the worker, the app and the graph builder via the API

## "Quick Start"

@@ -87,3 +88,7 @@ The app is listening on `/api/v1/jobs` for new `POST` requests to generate some
- Zip graph tiles from disk according to the request's bounding box and put the package to `$DATA_DIR/output/<JOB_NAME>`, along with a metadata JSON
- **busy**, the current job will be put in the queue and will be processed once it reaches the queue's head
4. Send an email to the requesting user with success or failure notice (including the error message)

### Logs

The app exposes logs via the route `/api/v1/logs/{log_type}`. Available log types are `worker`, `app` and `builder`. An optional query parameter `?lines={n}` limits the output to the last `n` lines. Authentication is required.
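
A minimal client-side sketch of the new endpoint as described above, assuming the app is reachable on `localhost:5000` (gunicorn's bind address in this PR) and using placeholder admin credentials:

```python
import requests

ADMIN_EMAIL = "admin@example.com"  # placeholder, use your configured admin user
ADMIN_PASS = "admin"               # placeholder

resp = requests.get(
    "http://localhost:5000/api/v1/logs/worker",  # log_type: worker, app or builder
    params={"lines": 100},                       # optional: only the last 100 lines
    auth=(ADMIN_EMAIL, ADMIN_PASS),              # HTTP basic auth is required
)
resp.raise_for_status()
print(resp.text)  # plain-text log content
```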
5 changes: 1 addition & 4 deletions cli.py
@@ -15,17 +15,14 @@
from routing_packager_app import SETTINGS
from routing_packager_app.db import get_db
from routing_packager_app.api_v1.models import Job, User
from routing_packager_app.logger import AppSmtpHandler, get_smtp_details
from routing_packager_app.logger import AppSmtpHandler, get_smtp_details, LOGGER
Collaborator Author:
consistent use of the logger throughout the app

from routing_packager_app.utils.geom_utils import wkbe_to_geom, wkbe_to_str

JOB_TIMEOUT = 60 * 60 # one hour to compress a single graph

description = "Runs the worker to update the ZIP packages."
parser = ArgumentParser(description=description)

# set up the logger basics
LOGGER = logging.getLogger("packager")


def _sort_jobs(jobs_: List[Job]):
out = list()
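The shared `LOGGER` imported above replaces the per-module `logging.getLogger("packager")` call that this diff removes. A rough sketch of the pattern, assuming `routing_packager_app/logger.py` defines the logger roughly like this (its actual contents are not part of this diff):

```python
import logging

# One named logger for the whole app; modules import it instead of creating
# their own, so handler/level configuration applies everywhere.
LOGGER = logging.getLogger("packager")
LOGGER.setLevel(logging.INFO)


# e.g. in cli.py or a worker module:
def compress_graph(job_name: str) -> None:
    LOGGER.info("Compressing graph for job %s", job_name)
```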
8 changes: 4 additions & 4 deletions conf/valhalla.conf
@@ -12,8 +12,8 @@ autorestart=false
redirect_stderr=true
# Either log to file inside the container or
# log to PID 1 (gunicorn in this case) so docker logs will show it
# stdout_logfile=/var/log/build_loop.log
# stdout_logfile_maxbytes=1MB
stdout_logfile=/proc/1/fd/1
stdout_logfile_maxbytes=0
stdout_logfile=%(ENV_TMP_DATA_DIR)s/logs/builder.log
stdout_logfile_maxbytes=1MB
# stdout_logfile=/proc/1/fd/1
# stdout_logfile_maxbytes=0
environment=CONCURRENCY="4",DATA_DIR="/app/data",TMP_DATA_DIR="/app/tmp_data"
3 changes: 3 additions & 0 deletions gunicorn.py
@@ -1,3 +1,6 @@
from routing_packager_app.logger import LOGGING_CONFIG
Collaborator Author:
This is nice: gunicorn allows us to pass a custom logging config!


bind = "0.0.0.0:5000"
workers = 1
worker_class = "uvicorn.workers.UvicornWorker"
logconfig_dict = LOGGING_CONFIG
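
gunicorn's `logconfig_dict` setting accepts a standard `logging.config.dictConfig`-style dictionary. The real `LOGGING_CONFIG` lives in `routing_packager_app/logger.py` and is not shown in this diff; a minimal sketch of what such a dictionary could look like:

```python
# Hypothetical stand-in for routing_packager_app.logger.LOGGING_CONFIG;
# the actual dictionary in this PR may differ.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "default": {"format": "%(asctime)s %(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "default"},
    },
    "root": {"level": "INFO", "handlers": ["console"]},
}
```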
2 changes: 1 addition & 1 deletion main.py
@@ -24,7 +24,7 @@ async def lifespan(app: FastAPI):
p.mkdir(exist_ok=True)
SETTINGS.get_output_path().mkdir(exist_ok=True)
yield
app.state.redis_pool.shutdown()
await app.state.redis_pool.shutdown()
Collaborator Author:
this was causing a warning



app: FastAPI = create_app(lifespan=lifespan)
33 changes: 11 additions & 22 deletions pyproject.toml
@@ -44,24 +44,19 @@ coveralls = "^3.3.1"
requires = ["poetry>=1.0.0"]
build-backend = "poetry.masonry.api"

[tool.black]
line-length = 105
exclude = '''
/(
\.git
| \.venv
| dist
| build
)/
'''

[tool.ruff]

extend-exclude = [".venv", "third_party", "build"]
lint.preview = true
format.preview = true

# Enable pycodestyle (`E`) and Pyflakes (`F`) codes by default.
select = ["E", "F"]
ignore = []
lint.select = ["E", "F", "RUF022"]
lint.ignore = []
line-length = 105

# Allow autofix for all enabled rules (when `--fix`) is provided.
fixable = [
lint.fixable = [
"A",
"B",
"C",
@@ -106,12 +101,6 @@ fixable = [
"TRY",
"UP",
"YTT",
"RUF022",
]
unfixable = []

# Exclude a variety of commonly ignored directories.
exclude = [".venv", "__pycache__", ".git"]

# Same as Black.
line-length = 105
target-version = "py312"
lint.unfixable = []
3 changes: 2 additions & 1 deletion routing_packager_app/api_v1/__init__.py
@@ -1,7 +1,8 @@
from fastapi import APIRouter

from .routes import users, jobs
from .routes import jobs, logs, users

api_v1_router = APIRouter()
api_v1_router.include_router(jobs.router, prefix="/jobs", tags=["jobs"])
api_v1_router.include_router(users.router, prefix="/users", tags=["users"])
api_v1_router.include_router(logs.router, prefix="/logs", tags=["logs"])
11 changes: 9 additions & 2 deletions routing_packager_app/api_v1/models.py
@@ -1,12 +1,13 @@
from datetime import datetime
from typing import Optional, List
from enum import Enum
from typing import List, Optional

from fastapi.security import HTTPBasicCredentials
from geoalchemy2 import Geography
from pydantic import EmailStr
from sqlalchemy import Column
from sqlalchemy_utils import PasswordType
from sqlmodel import SQLModel, Field, DateTime, Relationship, Session, select, AutoString
from sqlmodel import AutoString, DateTime, Field, Relationship, Session, SQLModel, select
Collaborator Author:
ruff also does isort's job now


from ..config import SETTINGS
from ..constants import Providers, Statuses
@@ -108,3 +109,9 @@ def add_admin_user(session: Session):
admin_user = User(email=admin_email, password=admin_pass)
session.add(admin_user)
session.commit()


class LogType(str, Enum):
WORKER = "worker"
APP = "app"
BUILDER = "builder"
48 changes: 48 additions & 0 deletions routing_packager_app/api_v1/routes/logs.py
@@ -0,0 +1,48 @@
from fastapi import APIRouter, Depends, HTTPException
from fastapi.responses import PlainTextResponse
from fastapi.security import HTTPBasicCredentials
from sqlmodel import Session
from starlette.status import (
HTTP_400_BAD_REQUEST,
HTTP_401_UNAUTHORIZED,
)

from ...auth.basic_auth import BasicAuth
from ...config import SETTINGS
from ...db import get_db
from ..models import LogType, User

router = APIRouter()


@router.get("/{log_type}", response_class=PlainTextResponse)
def get_logs(
log_type: LogType,
lines: int | None = None,
db: Session = Depends(get_db),
auth: HTTPBasicCredentials = Depends(BasicAuth),
):
# first authenticate
req_user = User.get_user(db, auth)
if not req_user:
raise HTTPException(HTTP_401_UNAUTHORIZED, "Not authorized to read logs.")

# figure out the type of logs
log_file = SETTINGS.get_logging_dir() / f"{log_type.value}.log"

try:
with open(log_file) as fh:
if lines is None:
return fh.read()
line_count = len([1 for _ in fh.readlines()])
start_i = line_count - lines if line_count > lines else 0
response = ""
fh.seek(0)
for i, line in enumerate(fh.readlines()):
if i < start_i:
continue
response += line
return response

except: # noqa
raise HTTPException(HTTP_400_BAD_REQUEST, f"Unable to open {log_file}.")
20 changes: 16 additions & 4 deletions routing_packager_app/config.py
@@ -18,7 +18,7 @@ class BaseSettings(_BaseSettings):

DESCRIPTION_PATH: Path = BASE_DIR.joinpath("DESCRIPTION.md")

### APP ###
# APP ###
ADMIN_EMAIL: str = "[email protected]"
ADMIN_PASS: str = "admin"
# TODO: clarify if there's a need to restrict origins
@@ -30,15 +30,15 @@

ENABLED_PROVIDERS: list[str] = list(CommaSeparatedStrings("osm"))

### DATABASES ###
# DATABASES ###
POSTGRES_HOST: str = "localhost"
POSTGRES_PORT: int = 5432
POSTGRES_DB: str = "gis"
POSTGRES_USER: str = "admin"
POSTGRES_PASS: str = "admin"
REDIS_URL: str = "redis://localhost"

### SMTP ###
# SMTP ###
SMTP_HOST: str = "localhost"
SMTP_PORT: int = 1025
SMTP_FROM: str = "[email protected]"
@@ -79,6 +79,19 @@ def get_tmp_data_dir(self) -> Path:

return tmp_data_dir

def get_logging_dir(self) -> Path:
Collaborator Author:
Internally all logs are written into the tmp_data_dir logs subdirectory, so they are easily shared with the FastAPI app.

"""
Gets the path where logs are stored for both worker and builder/app
"""
tmp_data_dir = self.TMP_DATA_DIR
if os.path.isdir("/app") and not os.getenv("CI", None): # pragma: no cover
tmp_data_dir = Path("/app/tmp_data")
log_dir = tmp_data_dir / "logs"

log_dir.mkdir(exist_ok=True)

return log_dir


class ProdSettings(BaseSettings):
model_config = SettingsConfigDict(case_sensitive=True, env_file=ENV_FILE, extra="ignore")
@@ -108,7 +121,6 @@ class TestSettings(BaseSettings):

# decide which settings we'll use
SETTINGS: Optional[BaseSettings] = None
print("LOADING SETTINGS")
env = os.getenv("API_CONFIG", "prod")
if env == "prod": # pragma: no cover
SETTINGS = ProdSettings()
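
With `get_logging_dir()` in place, a handler writing into that directory is all that is needed for the worker's output to show up under the new `/api/v1/logs` endpoint. A hypothetical sketch of that wiring (the actual handler setup in `routing_packager_app/logger.py` is not shown in this diff):

```python
import logging

from routing_packager_app.api_v1.models import LogType
from routing_packager_app.config import SETTINGS

# Write the worker log into <TMP_DATA_DIR>/logs/worker.log, the same file
# the GET /api/v1/logs/worker route reads back.
log_file = SETTINGS.get_logging_dir() / f"{LogType.WORKER.value}.log"
handler = logging.FileHandler(log_file)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s: %(message)s"))
logging.getLogger("packager").addHandler(handler)
```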