
Commit

Bump version to 0.1.0a0.dev8 (#30)
fcogidi authored Nov 4, 2024
1 parent f1abf92 commit dbce6da
Showing 4 changed files with 160 additions and 158 deletions.
8 changes: 4 additions & 4 deletions .pre-commit-config.yaml
@@ -16,13 +16,13 @@ repos:
       - id: check-toml
 
   - repo: https://github.com/python-poetry/poetry
-    rev: 1.8.3
+    rev: 1.8.4
     hooks:
       - id: poetry-check
         args: [--lock]
 
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.6.9
+    rev: v0.7.2
     hooks:
       - id: ruff
         args: [--fix, --exit-non-zero-on-fix]
@@ -31,13 +31,13 @@ repos:
         types_or: [ python, pyi, jupyter ]
 
   - repo: https://github.com/crate-ci/typos
-    rev: v1.26.0
+    rev: v1.27.0
     hooks:
       - id: typos
         args: []
 
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.11.2
+    rev: v1.13.0
     hooks:
       - id: mypy
         entry: mypy
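These rev bumps are the kind of change that pre-commit's autoupdate command produces: pre-commit autoupdate rewrites each hook's rev to the latest tagged release, and pre-commit run --all-files then re-checks the repository against the updated hooks. The commit itself does not state how the new versions were chosen.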
2 changes: 1 addition & 1 deletion mmlearn/modules/metrics/retrieval_recall.py
@@ -99,7 +99,7 @@ def _is_distributed(self) -> bool:
         if self.distributed_available_fn is not None:
             distributed_available = self.distributed_available_fn
 
-        return distributed_available() if callable(distributed_available) else False  # type: ignore[no-any-return]
+        return distributed_available() if callable(distributed_available) else False
 
     def update(self, x: torch.Tensor, y: torch.Tensor, indexes: torch.Tensor) -> None:
         """Check shape, convert dtypes and add to accumulators.
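For readers skimming the diff: the only change here drops the # type: ignore[no-any-return] comment. The callable() guard already forces the expression to evaluate to a bool, so the suppression is presumably no longer needed under the newer mypy pinned in this same commit. A minimal, self-contained sketch of the same pattern, using hypothetical names rather than the actual mmlearn class:

    from typing import Callable, Optional

    def is_distributed(distributed_available_fn: Optional[Callable[[], bool]]) -> bool:
        """Illustrative sketch: call an optional availability hook, defaulting to False."""
        # Mirror the guarded call in the diff: only invoke the hook if it is callable,
        # otherwise report that no distributed environment is available.
        distributed_available = distributed_available_fn
        return distributed_available() if callable(distributed_available) else False

    # Example usage with stubbed-in hooks (hypothetical):
    print(is_distributed(lambda: True))  # True
    print(is_distributed(None))          # False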
