Update CI and tests #84

Merged (25 commits, Dec 10, 2023)

Commits
2c7799c
Adjust unit defs in .historical.legacy for pint 0.22
khaeru Dec 10, 2023
d6da9ca
Import explicitly from sdmx.model.v21
khaeru Dec 10, 2023
ed462cf
Correct typo "O" (letter) → "0" (digit) in units
khaeru Dec 10, 2023
e08ebb1
Update SDMX usage for ≥ 2.10
khaeru Dec 10, 2023
3bbf2c7
Silence numpy warnings in .historical.T012
khaeru Dec 10, 2023
6265a09
Xfail two tests of OpenKAPSARC
khaeru Dec 10, 2023
eaa6186
Remove deprecated pd.DataFrame.append() in .T003
khaeru Dec 10, 2023
7c0f4ab
Transfer project info to pyproject.toml
khaeru Dec 10, 2023
5ec7860
Adjust for mypy --no-implicit-optional
khaeru Dec 10, 2023
014d654
Satisfy mypy in .structure.base
khaeru Dec 10, 2023
a93477a
Remove requests-cache from dependencies
khaeru Dec 10, 2023
7f3d570
Add pre-commit configuration
khaeru Dec 10, 2023
9891e96
Replace "lint" CI workflow with pre-commit job
khaeru Dec 10, 2023
6f2fcf0
Gitignore .{benchmarks,ruff_cache}
khaeru Dec 10, 2023
e6741fb
Configure ruff
khaeru Dec 10, 2023
700748e
Use reusable "publish" action from iiasa/actions
khaeru Dec 10, 2023
e379f04
Update "pytest" CI workflow
khaeru Dec 10, 2023
3da4240
Mark support for Python 3.8 through 3.12
khaeru Dec 10, 2023
26daae7
Correct coverage config in pyproject.toml
khaeru Dec 10, 2023
311f4b5
Split diagnostics to a separate CI job
khaeru Dec 10, 2023
c51e166
Map "Turkey" → TUR in COUNTRY_NAME
khaeru Dec 10, 2023
c07f38d
Update test_historical.test_coverage()
khaeru Dec 10, 2023
906523d
Xfail two tests depending on discontinued OpenKAPSARC data flows
khaeru Dec 10, 2023
30f7e58
Temporarily disable "diagnostics" CI job
khaeru Dec 10, 2023
40e254a
Expand CLI tests
khaeru Dec 10, 2023
76 changes: 0 additions & 76 deletions .github/workflows/lint.yaml

This file was deleted.

41 changes: 3 additions & 38 deletions .github/workflows/publish.yaml
@@ -13,41 +13,6 @@ on:

jobs:
publish:
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v2
with:
submodules: true

- uses: actions/setup-python@v2

- name: Cache Python packages
uses: actions/cache@v2
with:
path: |
~/.cache/pip
key: publish-${{ runner.os }}

- name: Upgrade pip, wheel, setuptools-scm
run: python -m pip install --upgrade pip wheel setuptools-scm twine

- name: Build package
run: |
python3 setup.py bdist_wheel sdist
twine check dist/*

- name: Publish to TestPyPI
uses: pypa/[email protected]
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
with:
user: __token__
password: ${{ secrets.TESTPYPI_TOKEN }}
repository_url: https://test.pypi.org/legacy/

- name: Publish to PyPI
uses: pypa/[email protected]
if: github.event_name == 'release'
with:
user: __token__
password: ${{ secrets.PYPI_TOKEN }}
uses: iiasa/actions/.github/workflows/publish.yaml@main
secrets:
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
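After this change, the workflow no longer builds and uploads the package itself; the entire job body becomes a call to a reusable workflow maintained in iiasa/actions. For orientation, a minimal caller of a reusable workflow has this shape (the `on:` trigger below is illustrative, not taken from the diff):

```yaml
name: Publish

on:
  release: {types: [published]}

jobs:
  publish:
    # A job that "uses:" another workflow may not define its own steps;
    # required secrets are passed through explicitly.
    uses: iiasa/actions/.github/workflows/publish.yaml@main
    secrets:
      PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
```

This moves the build/twine/upload logic out of the repository, so version bumps to checkout, setup-python, and the PyPA publish action happen once in iiasa/actions rather than in every repo.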
132 changes: 80 additions & 52 deletions .github/workflows/pytest.yaml
@@ -1,4 +1,4 @@
name: Test suite & diagnostics
name: Test

on:
push:
@@ -8,34 +8,37 @@ on:
schedule:
- cron: "0 5 * * *"

# Cancel previous runs that have not completed
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true

env:
# Location of RCLONE configuration file
RCLONE_CONFIG: ci/rclone.conf
# Path & URL fragments for uploaded historical data & diagnostics
gcs_bucket: gcs:data.transportenergy.org/historical/ci/
gcs_url: https://storage.googleapis.com/data.transportenergy.org/historical/ci/
# True if the event is a pull request and the incoming branch is within the
# transportenergy/database repo (as opposed to a fork). Only under this
# condition is the GCS_SERVICE_ACCOUNT_* secret available.
pr_from_main_repo: github.event_name != 'pull_request' || startsWith(github.event.pull_request.head.label, 'transportenergy:')

jobs:
pytest:
strategy:
matrix:
include:
- os: ubuntu-latest
python-version: "3.7"
run-diagnostics: false
- os: ubuntu-latest
python-version: "3.8"
run-diagnostics: false
- os: ubuntu-latest
python-version: "3.9"
run-diagnostics: true
- os: windows-latest
python-version: "3.9"
run-diagnostics: false
os:
- macos-latest
- ubuntu-latest
- windows-latest

python-version:
- "3.8"
- "3.9"
- "3.10"
- "3.11"
- "3.12"

# TEMPORARY Never run diagnostics
run-diagnostics:
- false

fail-fast: false

@@ -44,67 +47,92 @@ jobs:
name: ${{ matrix.os }}-py${{ matrix.python-version }}

steps:
- name: Cancel previous runs that have not completed
uses: styfle/[email protected]
with:
access_token: ${{ github.token }}

- uses: actions/checkout@v2
- uses: actions/checkout@v3
with:
submodules: true

- uses: actions/setup-python@v2
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: "**/pyproject.toml"

- name: Cache Python packages
uses: actions/cache@v2
with:
path: |
~/.cache/pip
~/.cache/rclone*.zip
~/appdata/local/pip/cache
key: ${{ matrix.os }}-${{ matrix.python-version }}
- name: Upgrade pip
run: python -m pip install --upgrade pip

- name: Upgrade pip, wheel
run: python -m pip install --upgrade pip wheel

- name: Install the Python package and dependencies
run: pip install --editable .[tests]
- name: Install the Python package and its dependencies
run: pip install .[tests]

- name: Run pytest
env:
OK_API_KEY: ${{ secrets.OPENKAPSARC_API_KEY }}
run: pytest --color=yes --cov-report=xml --verbose item

- name: Upload test coverage to Codecov.io
uses: codecov/codecov-action@v1
uses: codecov/codecov-action@v3

diagnostics:
# Temporarily disabled to merge #84
if: false
# True if the event is a pull request and the incoming branch is within the
# transportenergy/database repo (as opposed to a fork). Only under this
# condition is the GCS_SERVICE_ACCOUNT_* secret available.
# if: github.event_name != 'pull_request' || startsWith(github.event.pull_request.head.label, 'transportenergy:')

needs: [pytest]
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
with:
submodules: true

- uses: actions/setup-python@v4
with: # Should be the same as the latest supported version, above
python-version: "3.12"
cache: pip
cache-dependency-path: "**/pyproject.toml"

- name: Upgrade pip
run: python -m pip install --upgrade pip

- name: Install rclone
if: env.pr_from_main_repo && matrix.run-diagnostics
- name: Install the Python package and its dependencies
run: pip install .[tests]

- name: Set up Rclone
uses: AnimMouse/setup-rclone@v1

- name: Create diagnostics and upload to Google Cloud Storage
env:
service_account_json: ${{ secrets.GCS_SERVICE_ACCOUNT_1 }}
run: |
mkdir -p $HOME/.cache
pushd $HOME/.cache
curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
popd
unzip $HOME/.cache/rclone-current-linux-amd64.zip
ls -d rclone-v* > $GITHUB_PATH
echo "$service_account_json" >ci/service-account-key.json

- name: Create diagnostics and upload to Google Cloud Storage
if: env.pr_from_main_repo && matrix.run-diagnostics
run: |
item historical diagnostics output/
rclone --progress copy output ${{ env.gcs_bucket }}${{ github.run_id }}/

- uses: LouisBrunner/[email protected]
if: env.pr_from_main_repo && matrix.run-diagnostics
with:
token: ${{ secrets.GITHUB_TOKEN }}
name: Upload historical database & diagnostics
conclusion: success
details_url: ${{ env.gcs_url }}${{ github.run_id }}/index.html
output: |
{"summary": "${{ env.gcs_url }}${{ github.run_id }}/index.html"}

pre-commit:
name: Code quality

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.x

- name: Force recreation of pre-commit virtual environment for mypy
if: github.event_name == 'schedule' # Comment this line to run on a PR
run: gh cache list -L 999 | cut -f2 | grep pre-commit | xargs -I{} gh cache delete "{}" || true
env: { GH_TOKEN: "${{ github.token }}" }

- uses: pre-commit/[email protected]
2 changes: 2 additions & 0 deletions .gitignore
@@ -15,10 +15,12 @@ ipython_config.py
*.log
*.pdf

.benchmarks
.cache
.coverage*
.mypy_cache
.pytest_cache
.ruff_cache
__pycache__
build
coverage.xml
27 changes: 27 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,27 @@
repos:
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.7.0
hooks:
- id: mypy
additional_dependencies:
- click
- iam-units
- Jinja2
- lxml-stubs
- pandas-stubs
- Pint
- pytest
- sdmx1
- traitlets
- types-PyYAML
- types-python-dateutil
- types-requests
- types-setuptools
- xarray
args: []
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.2
hooks:
- id: ruff
- id: ruff-format
args: [ --check ]
2 changes: 1 addition & 1 deletion item/data
Submodule data updated 1 file
+4 −8 historical/sources.yaml
13 changes: 9 additions & 4 deletions item/historical/T003.py
@@ -89,8 +89,8 @@ def lookup(value):

df = pd.concat([df, df["Variable"].apply(lookup)], axis=1)

return (
# Compute partial sums that exclude pipelines
# Compute partial sums that exclude pipelines
df0 = (
# Select only the subset of variables, then group by Country and TIME_PERIOD
df[df["Variable"].isin(PARTIAL)]
.groupby(["Country", "TIME_PERIOD"])
@@ -101,8 +101,13 @@ def lookup(value):
.reset_index()
# Assign other dimensions for this sum
.assign(mode="Inland ex. pipeline")
# Concatenate with the original data
.append(df, ignore_index=True)
)

# - Concatenate with the original data.
# - Fill "operator" and "vehicle" key values.
# - Sort.
return (
pd.concat([df, df0], ignore_index=True)
.fillna({"operator": "_T", "vehicle": "_T"})
.sort_values(by=["Country", "TIME_PERIOD", "mode", "vehicle"])
)
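The change above replaces `pd.DataFrame.append()`, which was deprecated in pandas 1.4 and removed in 2.0, with `pd.concat()`. A minimal sketch of the same migration on toy data (column names borrowed from the diff; the values are fabricated):

```python
import pandas as pd

# Original data, analogous to `df` in T003.py
df = pd.DataFrame(
    {
        "Country": ["AT", "AT"],
        "TIME_PERIOD": [2000, 2001],
        "Value": [1.0, 2.0],
        "mode": ["Rail", "Rail"],
    }
)
# Partial sum computed separately, analogous to `df0` in the diff
df0 = pd.DataFrame(
    {"Country": ["AT"], "TIME_PERIOD": [2000], "Value": [3.0], "mode": ["Inland ex. pipeline"]}
)

# Removed API: df0.append(df, ignore_index=True)
# Replacement — concatenate the frames, then renumber the index:
result = pd.concat([df, df0], ignore_index=True)
```

Unlike `append()`, `pd.concat()` takes a list, so any number of frames can be combined in one call; columns missing from either frame are filled with NaN, which is why the diff follows up with `.fillna(...)`.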
4 changes: 1 addition & 3 deletions item/historical/T012.py
@@ -1,6 +1,4 @@
"""Data cleaning code and configuration for T012."""
import numpy as np

from item.util import convert_units, dropna_logged

#: iTEM data flow matching the data from this source.
@@ -54,7 +52,7 @@ def process(df):
Value=lambda df_: df_["Value"]
.str.replace(" ", "")
.replace("...", "NaN")
.astype(np.float)
.astype(float)
)
.pipe(dropna_logged, "Value", [COLUMNS["country_name"]])
.pipe(convert_units, "kpassenger", "Mpassenger")
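The `np.float` alias removed here was deprecated in NumPy 1.20 and dropped in 1.24; the builtin `float` is a drop-in replacement for this use, which also makes the `import numpy as np` unnecessary. A small sketch of the cleaning chain from the diff, on fabricated values:

```python
import pandas as pd

s = pd.Series(["1 234", "...", "56"])
values = (
    s.str.replace(" ", "")  # drop thousands separators
    .replace("...", "NaN")  # exact-match replace of the source's missing-value marker
    .astype(float)          # previously .astype(np.float); alias removed in NumPy 1.24
)
```

Note that `Series.replace()` matches whole values exactly by default, so `"..."` is not treated as a regular expression here; the `"NaN"` string then parses to a float NaN in `astype(float)`.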