
WIP: Clean up the s3 upload workflow #151

Workflow file for this run

# This workflow tests the model on the main branch
# TODO(@dhanshree): Test ersilia in all supported Python versions
name: Model test on push

on:
  push:
    branches: ["main"]
  workflow_dispatch:

jobs:
  test:
    if: github.repository != 'ersilia-os/eos-template'
    runs-on: ubuntu-latest
    steps:
      # This might stop working in the future, so we need to keep an eye on it
      - name: Free Disk Space (Ubuntu)
        uses: jlumbroso/free-disk-space@main
        with:
          # if set to "true", this might remove tools that are actually needed,
          # but it frees about 6 GB
          tool-cache: true
          # all of these default to true, but feel free to set them to
          # "false" if necessary for your workflow
          android: true
          dotnet: true
          haskell: true
          large-packages: true
          swap-storage: true
      - uses: actions/[email protected]
        with:
          lfs: true

      - uses: conda-incubator/setup-miniconda@v3
        with:
          auto-update-conda: true
          python-version: "3.10.10"

      - name: Install dependencies
        run: |
          conda install git-lfs -c conda-forge
          git-lfs install
          conda install gh -c conda-forge
          python -m pip install 'ersilia[test]'

      - name: Update metadata to AirTable
        id: update-metadata-to-airtable
        env:
          USER_NAME: ${{ github.repository_owner }}
          BRANCH: "main"
          REPO_NAME: ${{ github.event.repository.name }}
          AIRTABLE_API_KEY: ${{ secrets.AIRTABLE_API_KEY }}
        uses: nick-fields/retry@v3
        with:
          timeout_minutes: 10
          max_attempts: 3
          command: |
            source activate
            pip install requests pyairtable
            echo "Updating metadata to AirTable looking at owner: $USER_NAME"
            wget https://raw.githubusercontent.com/ersilia-os/ersilia/master/.github/scripts/airtableops.py
            python3 airtableops.py airtable-update --user $USER_NAME --repo $REPO_NAME --branch $BRANCH --api-key $AIRTABLE_API_KEY
            rm airtableops.py
      - name: Sync metadata to S3 JSON
        id: sync-metadata-to-s3
        env:
          AIRTABLE_API_KEY: ${{ secrets.AIRTABLE_API_KEY }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          source activate
          wget https://raw.githubusercontent.com/ersilia-os/ersilia/master/.github/scripts/convert_airtable_to_json.py
          pip install boto3 requests pyairtable
          python convert_airtable_to_json.py $AIRTABLE_API_KEY $AWS_ACCESS_KEY_ID $AWS_SECRET_ACCESS_KEY
          rm convert_airtable_to_json.py

      - name: Update README file
        id: update-readme-file
        env:
          MODEL_ID: ${{ github.event.repository.name }}
        run: |
          source activate
          echo "Updating README file with AirTable metadata for model: $MODEL_ID"
          wget https://raw.githubusercontent.com/ersilia-os/ersilia/master/.github/scripts/airtableops.py
          python3 airtableops.py readme-update --repo $MODEL_ID --path .
          rm airtableops.py
          less README.md

      - name: Commit and push changes done to the README file
        uses: actions-js/push@156f2b10c3aa000c44dbe75ea7018f32ae999772 # [email protected]
        with:
          author_name: "ersilia-bot"
          author_email: "[email protected]"
          message: "updating readme [skip ci]"
          repository: "ersilia-os/${{ github.event.repository.name }}"
          github_token: ${{ secrets.GITHUB_TOKEN }}
          amend: true
          force: true

      - name: Test output
        run: |
          output=$(python .github/scripts/verify_model_outcome.py output.csv)
          if echo "$output" | grep -q "All outcomes are null"; then
            echo "Error in model outcome, aborting test"
            exit 1
          fi

      # TODO(@dhanshree) We can potentially restore the retries.
      # TODO(@dhanshree) Correct the test command as needed
      - name: Test model
        env:
          MODEL_ID: ${{ github.event.repository.name }}
        run: |
          ersilia -v test $MODEL_ID -d $MODEL_ID --inspect --remote -l deep > test.log

      # Upload EOS logs and test logs
      - name: Upload log output
        if: always()
        uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb # pin v3.1.1
        with:
          name: debug-logs
          retention-days: 14
          path: |
            /home/runner/eos/*.log
            ./*.log

Check failure on line 125 in .github/workflows/test-model.yml

GitHub Actions / .github/workflows/test-model.yml

Invalid workflow file

You have an error in your yaml syntax on line 125
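
This annotation means the runner rejected the workflow file at parse time, so none of the steps above actually ran. A quick way to catch this class of failure before pushing is to parse the file locally; a minimal sketch, assuming Python with PyYAML is installed on your machine:

    # Parse the workflow file with PyYAML: prints a traceback pointing at the
    # offending line/column if the YAML is malformed, and nothing on success.
    python3 -c "import yaml; yaml.safe_load(open('.github/workflows/test-model.yml'))"

A dedicated linter such as actionlint additionally checks Actions-specific structure (step keys, expressions) on top of plain YAML syntax.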