Migrate to ruff (#475)
* Switched from flakeheaven to ruff as linter

* Changed docs to now use ruff instead of flakeheaven

* Skip other jobs in CI (revert once change is finished)

* Corrected linting error about not chaining exceptions (B904)

* Corrected linting error about assigning names to lambdas (E731)

* Corrected issue with true/false comparisons

* Corrected membership tests to use `not in` (E713)

* Corrected accessing attributes without assigning them (B018)

* Re-added all tests in CI

* Corrected empty lines containing whitespace (W293)

* Removed trailing whitespace (W291)

* Corrected comparison to None (E711)

* Added line breaks that were missed when correcting the lambdas
jakob-fritz authored Sep 3, 2024
1 parent 453d024 commit 3b0576b
Showing 23 changed files with 81 additions and 73 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/ci_pipeline.yml
@@ -25,13 +25,13 @@ jobs:
with:
environment-file: ${{ env.YML }}
create-args: >-
python=3.10
python=3.12
- name: Code reformatting with black
run: |
black pySDC --check --diff --color
- name: Linting with flakeheaven
- name: Linting with ruff
run: |
flakeheaven lint --benchmark pySDC
ruff check pySDC
user_cpu_tests_linux:
runs-on: ubuntu-latest
18 changes: 9 additions & 9 deletions docs/contrib/02_continuous_integration.md
@@ -6,17 +6,17 @@ Finally, the CI also build artifacts that are used to generate the documentation

## Code linting

Code style linting is performed using [black](https://black.readthedocs.io/en/stable/) and [flakeheaven](https://flakeheaven.readthedocs.io/en/latest/) for code syntax checking. In particular, `black` is used to check compliance with (most of) [PEP-8 guidelines](https://peps.python.org/pep-0008/).
Code style linting is performed using [black](https://black.readthedocs.io/en/stable/) and [ruff](https://docs.astral.sh/ruff/) for code syntax checking. In particular, `black` is used to check compliance with (most of) [PEP-8 guidelines](https://peps.python.org/pep-0008/).

Those tests are conducted for each commit (even for forks), but you can also run it locally in the root folder of `pySDC` before pushing any commit :

```bash
# Install required packages (works also with conda/mamba)
pip install black flakeheaven flake8-comprehensions flake8-bugbear
pip install black ruff
# First : test code style linting with black
black pySDC --check --diff --color
# Second : test code syntax with flakeheaven
flakeheaven lint --benchmark pySDC
# Second : test code syntax with ruff
ruff check pySDC
```

> :bell: To avoid any error about formatting (`black`), you can simply use this program to reformat your code directly using the command :
@@ -59,7 +59,7 @@ fi
You may need to run `chmod +x` on the file to allow it to be executed.
Be aware that the hook will alter files you may have opened in an editor whenever you make a commit, which may confuse you(r editor).

To automate flakeheaven, we want to write a hook that alters the commit message in case any errors are detected. This gives us the choice of aborting the commit and fixing the issues, or we can go ahead and commit them and worry about flakeheaven only when the time comes to do a pull request.
To automate ruff, we want to write a hook that alters the commit message in case any errors are detected. This gives us the choice of aborting the commit and fixing the issues, or we can go ahead and commit them and worry about ruff only when the time comes to do a pull request.
To obtain this functionality, add the following to `<pySDC-root-directory>/.git/hooks/prepare-commit-msg`:

```bash
Expand All @@ -71,11 +71,11 @@ export files=$(git diff --staged --name-only HEAD | grep .py | sed -e "s,^,$(git

if [[ $files != "" ]]
then
export flakeheaven_output=$(flakeheaven lint --format default $files)
if [[ "$flakeheaven_output" != 0 ]]
export ruff_output=$(ruff check --quiet $files)
if [[ "$ruff_output" != 0 ]]
then
git interpret-trailers --in-place --trailer "$(echo "$flakeheaven_output" | sed -e 's/^/#/')" "$COMMIT_MSG_FILE"
git interpret-trailers --in-place --trailer "#!!!!!!!!!! WARNING: FLAKEHEAVEN FAILED !!!!!!!!!!" "$COMMIT_MSG_FILE"
git interpret-trailers --in-place --trailer "$(echo "$ruff_output" | sed -e 's/^/#/')" "$COMMIT_MSG_FILE"
git interpret-trailers --in-place --trailer "#!!!!!!!!!! WARNING: RUFF FAILED !!!!!!!!!!" "$COMMIT_MSG_FILE"
fi
fi

4 changes: 1 addition & 3 deletions etc/environment-lint.yml
@@ -6,6 +6,4 @@ channels:
- defaults
dependencies:
- black
- flakeheaven
- flake8-comprehensions
- flake8-bugbear
- ruff
2 changes: 1 addition & 1 deletion pySDC/core/collocation.py
@@ -65,7 +65,7 @@ def __init__(self, num_nodes=None, tleft=0, tright=1, node_type='LEGENDRE', quad
nNodes=num_nodes, nodeType=node_type, quadType=quad_type, tLeft=tleft, tRight=tright
)
except Exception as e:
raise CollocationError(f"could not instantiate qmat generator, got error: {e}")
raise CollocationError(f"could not instantiate qmat generator, got error: {e}") from e

# Set base attributes
self.num_nodes = num_nodes
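
For readers unfamiliar with B904: `raise ... from e` records the caught exception as the explicit cause of the new one, so both tracebacks are shown together. A minimal, self-contained sketch of the pattern (the helper name and message are illustrative, only `CollocationError` is borrowed from the diff above):

```python
class CollocationError(Exception):
    """Stand-in for pySDC's CollocationError, used only for this sketch."""


def make_generator(num_nodes):
    try:
        if num_nodes < 1:
            raise ValueError("need at least one collocation node")
    except ValueError as e:
        # B904: chain with `from e` so the ValueError appears as the
        # explicit cause of the CollocationError in the traceback.
        raise CollocationError(f"could not instantiate qmat generator, got error: {e}") from e
```

Without `from e`, Python still prints the original traceback, but labels it as an error that occurred "during handling" rather than as the direct cause.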
4 changes: 2 additions & 2 deletions pySDC/core/common.py
@@ -56,8 +56,8 @@ def _makeAttributeAndRegister(self, *names, localVars=None, readOnly=False):
for name in names:
try:
super().__setattr__(name, localVars[name])
except KeyError: # pragma: no cover
raise ValueError(f'value for {name} not given in localVars')
except KeyError as e: # pragma: no cover
raise ValueError(f'value for {name} not given in localVars') from e
# Register as class parameter
if readOnly:
self._parNamesReadOnly = self._parNamesReadOnly.union(names)
2 changes: 1 addition & 1 deletion pySDC/core/sweeper.py
@@ -104,7 +104,7 @@ def setupGenerator(self, qd_type):
tLeft=coll.tleft,
)
except Exception as e:
raise ValueError(f"could not generate {qd_type=!r} with qmat, got error : {e}")
raise ValueError(f"could not generate {qd_type=!r} with qmat, got error : {e}") from e

def get_Qdelta_implicit(self, qd_type, k=None):
QDmat = np.zeros_like(self.coll.Qmat)
14 changes: 10 additions & 4 deletions pySDC/implementations/hooks/log_solution.py
@@ -84,7 +84,7 @@ class LogToFile(Hooks):
index associated with individual files by giving a different lambda function ``format_index`` class attribute. This
lambda should accept one index and return one string.
You can also give a custom ``logging_condition`` lambda, accepting the current level if you want to log selectively.
You can also give a custom ``logging_condition`` function, accepting the current level if you want to log selectively.
Importantly, you may need to change ``process_solution``. By default, this will return a numpy view of the solution.
Of course, if you are not using numpy, you need to change this. Again, this is a lambda accepting the level.
@@ -99,9 +99,15 @@ class LogToFile(Hooks):

path = None
file_name = 'solution'
logging_condition = lambda L: True
process_solution = lambda L: {'t': L.time + L.dt, 'u': L.uend.view(np.ndarray)}
format_index = lambda index: f'{index:06d}'

def logging_condition(L):
return True

def process_solution(L):
return {'t': L.time + L.dt, 'u': L.uend.view(np.ndarray)}

def format_index(index):
return f'{index:06d}'

def __init__(self):
super().__init__()
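
The E731 changes above swap named lambdas for `def` statements; behaviour is unchanged, but the functions get a proper `__name__` and clearer tracebacks. A generic before/after sketch, mirroring `format_index` (the second definition simply shadows the first, and the `noqa` comment shows how the warning could be suppressed instead):

```python
# Flagged by ruff (E731): a lambda assigned to a name.
format_index = lambda index: f'{index:06d}'  # noqa: E731

# Accepted replacement with identical behaviour.
def format_index(index):
    return f'{index:06d}'


assert format_index(42) == '000042'
```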
4 changes: 2 additions & 2 deletions pySDC/implementations/problem_classes/Battery.py
@@ -163,7 +163,7 @@ def eval_f(self, u, t):
else:
# proof all switching conditions and find largest index where it drops below V_ref
switch = [True if u[k] <= self.V_ref[k - 1] else False for k in range(1, len(u))]
max_index = max([k if switch[k] == True else -1 for k in range(len(switch))])
max_index = max([k if switch[k] else -1 for k in range(len(switch))])

if max_index == -1:
f.expl[:] = self.switch_f[0]
@@ -200,7 +200,7 @@ def solve_system(self, rhs, factor, u0, t):
else:
# proof all switching conditions and find largest index where it drops below V_ref
switch = [True if rhs[k] <= self.V_ref[k - 1] else False for k in range(1, len(rhs))]
max_index = max([k if switch[k] == True else -1 for k in range(len(switch))])
max_index = max([k if switch[k] else -1 for k in range(len(switch))])
if max_index == -1:
self.A = self.switch_A[0]
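
For the `== True` cleanup, the boolean can be used directly; the result is identical. A small sketch with dummy data (the values are illustrative, not taken from the battery model):

```python
switch = [True, False, True]

# Flagged form: explicit comparison against True.
max_index_old = max([k if switch[k] == True else -1 for k in range(len(switch))])

# Preferred form with the same result.
max_index_new = max([k if switch[k] else -1 for k in range(len(switch))])

assert max_index_old == max_index_new == 2
```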

@@ -86,9 +86,9 @@ def __init__(
"""Initialization routine"""

# make sure parameters have the correct types
if not type(nvars) in [int, tuple]:
if type(nvars) not in [int, tuple]:
raise ProblemError('nvars should be either tuple or int')
if not type(freq) in [int, tuple]:
if type(freq) not in [int, tuple]:
raise ProblemError('freq should be either tuple or int')

# transforms nvars into a tuple
2 changes: 1 addition & 1 deletion pySDC/implementations/problem_classes/PenningTrap_3D.py
@@ -172,7 +172,7 @@ def eval_f(self, part, t):

self.work_counters['rhs']()
try:
penningtrap.Harmonic_oscillator
_unused = penningtrap.Harmonic_oscillator
Emat = np.diag([0, 0, -1])
except AttributeError:
Emat = np.diag([1, 1, -2])
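
The B018 fix keeps the intentional attribute lookup (its `AttributeError` is what selects the fallback branch) while making explicit that the value itself is unused. A sketch with a dummy object rather than the actual problem class:

```python
import numpy as np


class DummyParams:
    """Stand-in object; Harmonic_oscillator may or may not be defined on it."""


params = DummyParams()

try:
    # A bare `params.Harmonic_oscillator` is flagged by B018 as a useless
    # expression; assigning it makes the probing intent explicit.
    _unused = params.Harmonic_oscillator
    Emat = np.diag([0, 0, -1])
except AttributeError:
    Emat = np.diag([1, 1, -2])
```

An arguably clearer alternative would be `hasattr(params, 'Harmonic_oscillator')`, but that is a bigger change than the lint migration needed.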
4 changes: 2 additions & 2 deletions pySDC/implementations/problem_classes/generic_ND_FD.py
@@ -95,9 +95,9 @@ def __init__(
bcParams=None,
):
# make sure parameters have the correct types
if not type(nvars) in [int, tuple]:
if type(nvars) not in [int, tuple]:
raise ProblemError('nvars should be either tuple or int')
if not type(freq) in [int, tuple]:
if type(freq) not in [int, tuple]:
raise ProblemError('freq should be either tuple or int')

# transforms nvars into a tuple
2 changes: 1 addition & 1 deletion pySDC/projects/DAE/sweepers/rungeKuttaDAE.py
@@ -15,7 +15,7 @@ class RungeKuttaDAE(RungeKutta):
.. math::
0 = F(u, u', t).
RK methods for general DAEs have the form
.. math::
2 changes: 1 addition & 1 deletion pySDC/projects/DAE/sweepers/semiImplicitDAE.py
@@ -26,7 +26,7 @@ class SemiImplicitDAE(FullyImplicitDAE):
0 = g(\vec{U}_0 + \Delta t (\mathbf{Q} \otimes \mathbf{I}_{n_d}) \vec{U}, \vec{z}, \tau),
where
- :math:`\tau=(\tau_1,..,\tau_M) in \mathbb{R}^M` the vector of collocation nodes,
- :math:`\vec{U}_0 = (u_0,..,u_0) \in \mathbb{R}^{MN_d}` the vector of initial condition spread to each node,
- spectral integration matrix :math:`\mathbf{Q} \in \mathbb{R}^{M \times M}`,
4 changes: 2 additions & 2 deletions pySDC/projects/Monodomain/problem_classes/TestODE.py
@@ -6,9 +6,9 @@
from pySDC.projects.Monodomain.datatype_classes.my_mesh import imexexp_mesh


"""
"""
Here we define the problems classes for the multirate Dahlquist test equation y'=lambda_I*y + lambda_E*y + lambda_e*y
Things are done so that it is compatible witht the sweepers.
Things are done so that it is compatible witht the sweepers.
"""


2 changes: 1 addition & 1 deletion pySDC/projects/Monodomain/run_scripts/run_TestODE.py
@@ -19,7 +19,7 @@
)
from pySDC.projects.Monodomain.sweeper_classes.runge_kutta.imexexp_1st_order import imexexp_1st_order

"""
"""
Run the multirate Dahlquist test equation and plot the stability domain of the method.
We vary only the exponential term and the stiff term, while the non stiff term is kept constant (to allow 2D plots).
"""
4 changes: 2 additions & 2 deletions pySDC/projects/PinTSimE/hardcoded_solutions.py
@@ -475,9 +475,9 @@ def testSolution(u_num, prob_cls_name, dt, use_adaptivity, use_detection):
if key == 't_switches' or key == 'e_event':
err_msg = f'{msg} Expected {key}={expected[key]}, got {key}={got[key]}'
if len(expected[key]) == got[key]:
assert np.allclose(expected[key], got[key], atol=1e-4) == True, err_msg
assert np.allclose(expected[key], got[key], atol=1e-4), err_msg
else:
assert np.isclose(expected[key][-1], got[key][-1], atol=1e-4) == True, err_msg
assert np.isclose(expected[key][-1], got[key][-1], atol=1e-4), err_msg
else:
err_msg = f'{msg} Expected {key}={expected[key]:.4e}, got {key}={got[key]:.4e}'
assert np.isclose(expected[key], got[key], atol=1e-4), err_msg
4 changes: 3 additions & 1 deletion pySDC/projects/PinTSimE/switch_estimator.py
@@ -297,7 +297,9 @@ def get_switch(t_interp, state_function, m_guess):
"""

LagrangeInterpolation = LagrangeApproximation(points=t_interp, fValues=state_function)
p = lambda t: LagrangeInterpolation.__call__(t)

def p(t):
return LagrangeInterpolation.__call__(t)

def fprime(t):
r"""
12 changes: 6 additions & 6 deletions pySDC/projects/Resilience/ResilienceStatistics.ipynb

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pySDC/projects/Resilience/fault_injection.py
@@ -55,7 +55,7 @@ def __init__(self, params=None):
for k, v in params.items():
setattr(self, k, v)

self._freeze
self._freeze()

@classmethod
def random(cls, args, rnd_params, random_generator=None):
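
This particular B018 hit looks like it uncovered an actual no-op: `self._freeze` only looked the method up, while `self._freeze()` calls it. A tiny illustration with a hypothetical class, not pySDC's actual fault-injection code:

```python
class Frozen:
    def _freeze(self):
        self._frozen = True


obj = Frozen()

obj._freeze        # flagged by B018: the bare lookup does nothing on its own
assert not hasattr(obj, '_frozen')

obj._freeze()      # the intended call
assert obj._frozen is True
```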
2 changes: 1 addition & 1 deletion pySDC/projects/Second_orderSDC/penningtrap_HookClass.py
@@ -41,7 +41,7 @@ def post_step(self, step, level_number):
#

try: # pragma: no cover
L.prob.Harmonic_oscillator
_unused = L.prob.Harmonic_oscillator

# add up kinetic and potntial contributions to total energy
epot = 0
2 changes: 1 addition & 1 deletion pySDC/projects/Second_orderSDC/plot_helper.py
@@ -314,7 +314,7 @@ def organize_data(self, filename='data/dt_vs_local_errorSDC.csv', time_iter=None
filename (string): data to find approximate order
time_iter : in case it you used different time iterations
"""
if time_iter == None:
if time_iter is None:
time_iter = self.time_iter

items = np.genfromtxt(filename, delimiter=',', skip_header=1)
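
E711 asks for identity checks against `None`: `== None` goes through `__eq__`, which custom classes and numpy arrays can override, whereas `is None` is exact. A short, self-contained sketch (the function and its defaults are hypothetical, not the plot_helper API):

```python
def organize_data(time_iter=None, default_iter=(1, 2, 3)):
    # `time_iter == None` would be flagged by E711 and can even return an
    # elementwise array when time_iter is a numpy array; `is None` is exact.
    if time_iter is None:
        time_iter = default_iter
    return time_iter


assert organize_data() == (1, 2, 3)
assert organize_data((5,)) == (5,)
```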
4 changes: 2 additions & 2 deletions pySDC/tutorial/step_6/C_MPI_parallelization.py
@@ -14,8 +14,8 @@ def main(cwd):
import mpi4py

del mpi4py
except ImportError:
raise ImportError('petsc tests need mpi4py')
except ImportError as e:
raise ImportError('petsc tests need mpi4py') from e

# Set python path once
my_env = os.environ.copy()
50 changes: 26 additions & 24 deletions pyproject.toml
@@ -47,9 +47,7 @@ apps = [
'mpi4py-fft>=2.0.2'
]
dev = [
'flakeheaven',
'flake8-comprehensions',
'flake8-bugbear',
'ruff',
'pytest',
'pytest-cov',
'sphinx'
@@ -70,31 +68,35 @@ markers = [
]
timeout = 300

[tool.flakeheaven]
max-line-length = 120
per-file-ignores = [
'pySDC/tutorial/step_6/C_MPI_parallelization.py:F401',
'pySDC/projects/Hamiltonian/solar_system.py:F401'
]
exclude = [
[tool.ruff]
line-length = 120
extend-exclude = [
'playgrounds',
'tests',
'*/data/*'
]
count = true
show-source = true
statistics = true

# list of plugins and rules for them
[tool.flakeheaven.plugins]
# include everything in pyflakes except F401
pyflakes = [
'+C*', '+E*', '+F*', '+W*', '+B*', '+B9*',
'-E203', '-E741', '-E402', '-W504', '-W605', '-F401'
]
#flake8-black = ["+*"]
flake8-bugbear = ["+*", '-B023', '-B028']
flake8-comprehensions = ["+*", '-C408', '-C417']

[tool.ruff.lint]
select = ["C", "E", "F", "W", "B"]
ignore = ["E203", # Whitespace before punctuation
"E741", # Ambiguous variable name
"E402", # Module level import not at top of cell
"W605", # Invalid escape sequence
"F401", # unused import
"B023", # function uses loop variable
"B028", # No excplicit stackvariable
"C408", # Unnecessary collection call
"C417", # Unnecessary map usage
# Newly added
"C901", # Complex name
"E501", # Line length, as enforced by black, but black ignores comments
"E721" # Type comparison
]
# W504 is not supported by ruff and does not need to be excluded

[tool.ruff.lint.per-file-ignores]
"pySDC/tutorial/step_6/C_MPI_parallelization.py" = ["F401"]
"pySDC/projects/Hamiltonian/solar_system.py" = ["F401"]

[tool.black]
line-length = 120