Release v1.1.1 #647

Closed · wants to merge 7 commits
167 changes: 112 additions & 55 deletions .github/workflows/CI.yml
@@ -70,34 +70,32 @@ jobs:
fail-fast: false

steps:
# used to reset cache every month
- name: Get current year-month
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- name: Get pip cache dir
id: pip-cache
run: |
echo "dir=$(pip cache dir)" >> $GITHUB_OUTPUT

- uses: actions/checkout@v3
- uses: actions/[email protected]

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5.1.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
cache: 'pip'
cache-dependency-path: '**/requirements.txt'
cache-dependency-path: |
**/requirements.txt
**/requirements-extras.txt
**/requirements-tests.txt

- name: Cache test_env
uses: actions/cache@v3
- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- uses: actions/cache/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ${{ steps.pip-cache.outputs.dir }}
# Look to see if there is a cache hit for the corresponding requirements files
# cache will be reset on changes to any requirements or every month
key: ${{ runner.os }}-venv-${{ hashFiles('**/requirements.txt') }}-${{ hashFiles('**/requirements-tests.txt') }}
-${{ hashFiles('**/requirements-extras.txt') }}-${{ hashFiles('**/CI.yml') }}-${{ hashFiles('setup.py') }}
-${{ steps.date.outputs.date }}
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}
restore-keys: datasets-
enableCrossOsArchive: true

- name: Install dependencies
run: |
@@ -112,6 +110,11 @@ jobs:

- name: Test with pytest
run: |
if [ -d ~/elephant-data ]; then
export ELEPHANT_DATA_LOCATION=~/elephant-data
echo $ELEPHANT_DATA_LOCATION
fi

coverage run --source=elephant -m pytest
coveralls --service=github || echo "Coveralls submission failed"
env:
@@ -131,22 +134,35 @@ jobs:
fail-fast: false
matrix:
# OS [ubuntu-latest, macos-latest, windows-latest]
os: [macos-11,macos-12]
os: [macos-12,macos-13]
python-version: [3.11]
steps:
- name: Get current year-month
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- uses: actions/checkout@v3
- uses: actions/checkout@v4.1.6

- name: Cache conda
uses: actions/cache@v3
with:
path: ~/conda_pkgs_dir
key: ${{ runner.os }}-conda-${{hashFiles('requirements/environment.yml') }}-${{ hashFiles('**/CI.yml') }}-${{ steps.date.outputs.date }}

- uses: conda-incubator/setup-miniconda@030178870c779d9e5e1b4e563269f3aa69b04081 # corresponds to v3.0.3
- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- uses: actions/cache/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}
restore-keys: datasets-

- uses: conda-incubator/setup-miniconda@a4260408e20b96e80095f42ff7f1a15b27dd94ca # corresponds to v3.0.4
with:
auto-update-conda: true
python-version: ${{ matrix.python-version }}
@@ -173,6 +189,10 @@ jobs:
- name: Test with pytest
shell: bash -l {0}
run: |
if [ -d ~/elephant-data ]; then
export ELEPHANT_DATA_LOCATION=~/elephant-data
echo $ELEPHANT_DATA_LOCATION
fi
pytest --cov=elephant

# __ ___ _
@@ -192,24 +212,32 @@ jobs:
os: [windows-latest]

steps:
- name: Get current year-month
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- uses: actions/checkout@v3
- uses: actions/[email protected]

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5.1.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
cache: 'pip'
cache-dependency-path: |
**/requirements.txt
**/requirements-extras.txt
**/requirements-tests.txt

- name: Cache pip
uses: actions/cache@v3
- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- uses: actions/cache/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ~\AppData\Local\pip\Cache
# Look to see if there is a cache hit for the corresponding requirements files
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}-${{ hashFiles('**/requirements-tests.txt') }}
-${{ hashFiles('**/requirements-extras.txt') }}-${{ hashFiles('setup.py') }} -${{ hashFiles('**/CI.yml') }}-${{ steps.date.outputs.date }}
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}
restore-keys: datasets-
enableCrossOsArchive: true

- name: Install dependencies
run: |
@@ -224,6 +252,10 @@ jobs:

- name: Test with pytest
run: |
if (Test-Path "$env:USERPROFILE\elephant-data") {
$env:ELEPHANT_DATA_LOCATION = "$env:USERPROFILE\elephant-data"
Write-Output $env:ELEPHANT_DATA_LOCATION
}
pytest --cov=elephant

# __ __ ____ ___
@@ -246,29 +278,32 @@ jobs:
fail-fast: false

steps:
- name: Get current year-month
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT
- uses: actions/checkout@v3
- uses: actions/[email protected]

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5.1.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
cache: 'pip'
cache-dependency-path: |
**/requirements.txt
**/requirements-extras.txt
**/requirements-tests.txt

- name: Get pip cache dir
id: pip-cache
- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dir=$(pip cache dir)" >> $GITHUB_OUTPUT
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- name: Cache test_env
uses: actions/cache@v3
- uses: actions/cache/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ${{ steps.pip-cache.outputs.dir }}
# look to see if there is a cache hit for the corresponding requirements files
# cache will be reset on changes to any requirements or every month
key: ${{ runner.os }}-venv-${{ hashFiles('**/requirements.txt') }}-${{ hashFiles('**/requirements-tests.txt') }}
-${{ hashFiles('**/requirements-extras.txt') }}-${{ hashFiles('setup.py') }} -${{ hashFiles('**/CI.yml') }}-${{ steps.date.outputs.date }}
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}
restore-keys: datasets-
enableCrossOsArchive: true

- name: Setup environment
run: |
@@ -287,6 +322,10 @@ jobs:

- name: Test with pytest
run: |
if [ -d ~/elephant-data ]; then
export ELEPHANT_DATA_LOCATION=~/elephant-data
echo $ELEPHANT_DATA_LOCATION
fi
mpiexec -n 1 python -m mpi4py -m coverage run --source=elephant -m pytest
coveralls --service=github || echo "Coveralls submission failed"
env:
@@ -316,7 +355,7 @@ jobs:
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- uses: actions/checkout@v3
- uses: actions/checkout@v4.1.6

- name: Get pip cache dir
id: pip-cache
@@ -330,6 +369,20 @@ jobs:

key: ${{ runner.os }}-pip-${{hashFiles('requirements/environment-tests.yml') }}-${{ hashFiles('**/CI.yml') }}-${{ steps.date.outputs.date }}

- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- uses: actions/cache/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}
restore-keys: datasets-
enableCrossOsArchive: true

- uses: conda-incubator/setup-miniconda@030178870c779d9e5e1b4e563269f3aa69b04081 # corresponds to v3.0.3
with:
auto-update-conda: true
@@ -358,6 +411,10 @@ jobs:
- name: Test with pytest
shell: bash -el {0}
run: |
if [ -d ~/elephant-data ]; then
export ELEPHANT_DATA_LOCATION=~/elephant-data
echo $ELEPHANT_DATA_LOCATION
fi
pytest --cov=elephant

# ____
@@ -383,7 +440,7 @@ jobs:
id: date
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- uses: actions/checkout@v3
- uses: actions/checkout@v4.1.6

- name: Get pip cache dir
id: pip-cache
@@ -448,10 +505,10 @@ jobs:
- name: Get current year-month
id: date
run: echo "::set-output name=date::$(date +'%Y-%m')"
- uses: actions/checkout@v3
- uses: actions/checkout@v4.1.6

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5.1.0
with:
python-version: ${{ matrix.python-version }}

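The CI changes above replace the old month-stamped pip cache keys with a dataset cache keyed on the current upstream commit of the elephant-data repository. A minimal sketch of that key derivation follows; the `git ls-remote` output is replaced here with a canned sample line (the SHA is made up) so the parsing step can be shown without network access:

```shell
# Derive a cache key from the HEAD of a remote repo, as the workflow's
# "Get current hash (SHA) of the elephant_data repo" step does.
# `git ls-remote <url> HEAD` prints "<sha>\tHEAD"; `cut -f1` keeps the SHA.
# Hypothetical sample line standing in for real `git ls-remote` output:
sample_line='a1b2c3d4e5f60718293a4b5c6d7e8f9012345678	HEAD'
dataset_hash=$(printf '%s\n' "$sample_line" | cut -f1)
echo "datasets-${dataset_hash}"
```

In the workflow itself the same `datasets-<sha>` string becomes the `key:` of `actions/cache/restore`, so the cache is invalidated exactly when elephant-data gains a new commit, while `restore-keys: datasets-` still allows a stale fallback on a miss.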
65 changes: 65 additions & 0 deletions .github/workflows/cache_elephant_data.yml
@@ -0,0 +1,65 @@
name: Create caches for elephant_data

on:
workflow_dispatch: # Workflow can be triggered manually via GH actions webinterface
push: # When something is pushed into master, this checks whether caches need to be re-created
branches:
- master
schedule:
- cron: "11 23 * * *" # Daily at 23:11 UTC


jobs:
create-data-cache-if-missing:
name: Caching data env
runs-on: ubuntu-latest
strategy:
# do not cancel all in-progress jobs if any matrix job fails
fail-fast: false

steps:
- name: Get current hash (SHA) of the elephant_data repo
id: elephant-data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/elephant-data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT

- uses: actions/[email protected]
# Loading cache of elephant-data
id: cache-datasets
with:
path: ~/elephant-data
key: datasets-${{ steps.elephant-data.outputs.dataset_hash }}

- name: Cache found?
run: echo "Cache-hit == ${{steps.cache-datasets.outputs.cache-hit == 'true'}}"

- name: Configuring git
if: steps.cache-datasets.outputs.cache-hit != 'true'
run: |
git config --global user.email "elephant_ci@fake_mail.com"
git config --global user.name "elephant CI"
git config --global filter.annex.process "git-annex filter-process" # recommended for efficiency

- name: Install Datalad Linux
if: steps.cache-datasets.outputs.cache-hit != 'true'
run: |
python -m pip install -U pip # Official recommended way
pip install datalad-installer
datalad-installer --sudo ok git-annex --method datalad/packages
pip install datalad

- name: Download dataset
id: download-dataset
if: steps.cache-datasets.outputs.cache-hit != 'true'
# Download repository and also fetch data
run: |
cd ~
datalad --version
datalad install --recursive --get-data https://gin.g-node.org/NeuralEnsemble/elephant-data

- name: Show size of the cache to assert data is downloaded
run: |
cd ~
du -hs ~/elephant-data
ls -lh ~/elephant-data
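The new workflow above downloads the dataset only on a cache miss, and the test steps in CI.yml export `ELEPHANT_DATA_LOCATION` only when the restored directory actually exists. A sketch of that guard, with a `ELEPHANT_DATA_DIR` override variable added here purely for local experimentation (it is not part of the workflow):

```shell
# Mirror the workflow's guard: point ELEPHANT_DATA_LOCATION at the
# cached dataset only if the directory exists (i.e. the cache was hit).
data_dir="${ELEPHANT_DATA_DIR:-$HOME/elephant-data}"  # hypothetical override for local runs
if [ -d "$data_dir" ]; then
    export ELEPHANT_DATA_LOCATION="$data_dir"
    echo "Using cached dataset at $ELEPHANT_DATA_LOCATION"
else
    echo "No cached dataset; tests fetch fixtures on demand"
fi
```

Because the variable is simply left unset on a miss, the pytest steps degrade gracefully instead of failing when the cache is cold.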

2 changes: 1 addition & 1 deletion .zenodo.json
@@ -17,7 +17,7 @@
}
],

"title": "Elephant 1.1.0",
"title": "Elephant 1.1.1",

"keywords": [
"neuroscience",
8 changes: 4 additions & 4 deletions codemeta.json
@@ -6,15 +6,15 @@
"contIntegration": "https://github.com/NeuralEnsemble/elephant/actions",
"dateCreated": "2022-03-14",
"datePublished": "2015-04-08",
"dateModified": "2024-03-19",
"dateModified": "2024-10-28",
"downloadUrl": "https://files.pythonhosted.org/packages/cb/b5/893fadd5505e638a4c8788bf0a2f5a211f59f45203f3dfa3919469e83ed4/elephant-1.0.0.tar.gz",
"issueTracker": "https://github.com/NeuralEnsemble/elephant/issues",
"name": "Elephant",
"version": "1.1.0",
"version": "1.1.1",
"identifier": "https://doi.org/10.5281/zenodo.1186602",
"description": "Elephant (Electrophysiology Analysis Toolkit) is an open-source, community-centered library for the analysis of electrophysiological data in the Python programming language. The focus of Elephant is on generic analysis functions for spike train data and time series recordings from electrodes, such as the local field potentials (LFP) or intracellular voltages. In addition to providing a common platform for analysis code from different laboratories, the Elephant project aims to provide a consistent and homogeneous analysis framework that is built on a modular foundation. \nElephant is the direct successor to Neurotools and maintains ties to complementary projects such as OpenElectrophy and spykeviewer.",
"applicationCategory": "library",
"releaseNotes": "https://github.com/NeuralEnsemble/elephant/releases/tag/v1.1.0",
"releaseNotes": "https://github.com/NeuralEnsemble/elephant/releases/tag/v1.1.1",
"funding": "EU Grant 604102 (HBP), EU Grant 720270(HBP), EU Grant 785907(HBP), EU Grant 945539(HBP)",
"developmentStatus": "active",
"keywords": [
@@ -34,7 +34,7 @@
"MacOS"
],
"softwareRequirements": [
"https://github.com/NeuralEnsemble/elephant/tree/v1.1.0/requirements"
"https://github.com/NeuralEnsemble/elephant/tree/v1.1.1/requirements"
],
"relatedLink": [
"http://python-elephant.org",