
Merge pull request #294 from libAtoms/espresso_remote_job_test
Add remote job test with an Espresso calculator evaluation
bernstei authored Mar 6, 2024
2 parents bf887a6 + c657a9f commit ffaddfa
Showing 10 changed files with 228 additions and 42 deletions.
63 changes: 37 additions & 26 deletions .github/workflows/pytests.yml
@@ -14,36 +14,40 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [ 3.8 ]
python-version: [ "3.9" ]
max-parallel: 5
env:
coverage-on-version: 3.8
coverage-on-version: "3.9"
use-mpi: True
defaults:
run:
shell: bash -l {0}

steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
- uses: actions/checkout@v4

- name: Set up python via conda
uses: conda-incubator/setup-miniconda@v3
with:
auto-update-conda: true
python-version: ${{ matrix.python-version }}

- name: Add conda to system path
- name: Check python version
run: |
# $CONDA is an environment variable pointing to the root of the miniconda directory
echo $CONDA/bin >> $GITHUB_PATH
# - name: Install Dependencies from Conda
# run: conda env update --file=devtools/conda-envs/environment.yml --name=base
which python3
python3 --version
- name: Install pip from Conda
run: conda install pip
- name: Install dependencies from pip
run: python3 -m pip install wheel setuptools numpy scipy click matplotlib pyyaml spglib rdkit flake8 pytest pytest-cov requests

- name: Install dependencies from pip (some will already be taken care of by conda's phono3py and its dependencies)
run: pip install wheel setuptools numpy scipy click matplotlib pandas pyyaml spglib rdkit-pypi flake8 pytest pytest-cov
- name: Check numpy
run: |
python3 -m pip list | grep numpy
python3 -c "import numpy; print(numpy.__file__, numpy.__version__)"
- name: Install latest ASE from gitlab
run: |
pip install git+https://gitlab.com/ase/ase.git
python3 -m pip install git+https://gitlab.com/ase/ase.git
echo -n "ASE VERSION "
python3 -c "import ase; print(ase.__file__, ase.__version__)"
@@ -87,12 +91,20 @@ jobs:
cd ..
- name: Install Quippy from PyPI
run: pip install quippy-ase
run: python3 -m pip install quippy-ase

- name: Install xTB (before things that need pandas like MACE and wfl, since it will break pandas-numpy compatibility by downgrading numpy)
run: |
conda install -c conda-forge xtb-python
python3 -m pip install typing-extensions
# install pandas now to encourage compatible numpy version after conda regressed it
python3 -m pip install pandas
- name: MACE
run: |
echo "search for torch version"
torch_version=$( pip3 install torch== 2>&1 | fgrep 'from versions' | sed -e 's/.* //' -e 's/)//' )
set +o pipefail
torch_version=$( python3 -m pip install torch== 2>&1 | fgrep 'from versions' | sed -e 's/.* //' -e 's/)//' )
echo "found torch version $torch_version, installing cpu-only variant"
python3 -m pip install torch==${torch_version}+cpu -f https://download.pytorch.org/whl/torch_stable.html
echo "installing mace"
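The torch-version discovery above works by asking pip to install the non-existent version `torch==` and scraping the "from versions" list out of the error message with `sed`. A Python sketch of the same parsing (the sample error text is illustrative, not captured from a real pip run):

```python
import re

def latest_version(pip_stderr: str) -> str:
    # Mimics the sed pipeline ('s/.* //' then 's/)//'): keep the last,
    # i.e. newest, entry of the "(from versions: ...)" list that pip
    # prints when asked for an impossible version.
    m = re.search(r"from versions: ([^)]*)\)", pip_stderr)
    if m is None:
        raise ValueError("no '(from versions: ...)' list found")
    return m.group(1).split(", ")[-1]

sample = ("ERROR: Could not find a version that satisfies the requirement "
          "torch== (from versions: 1.13.0, 2.0.0, 2.1.2)")
print(latest_version(sample))  # -> 2.1.2
```

The result is then reused to pin the matching `+cpu` wheel, as the workflow step does.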
Expand All @@ -101,15 +113,15 @@ jobs:
- name: Julia and ace fit
run: |
pip install pip install threadpoolctl
python3 -m pip install threadpoolctl
wget https://julialang-s3.julialang.org/bin/linux/x64/1.8/julia-1.8.1-linux-x86_64.tar.gz
tar xzf julia-1.8.1-linux-x86_64.tar.gz
# note that this hardwires a particular compatible ACE1pack version
echo 'using Pkg; pkg"registry add https://github.com/JuliaRegistries/General"; pkg"registry add https://github.com/JuliaMolSim/MolSim.git"; pkg"add [email protected], ACE1, JuLIP, IPFitting, ASE"' > ace1pack_install.jl
${PWD}/julia-1.8.1/bin/julia ace1pack_install.jl
- name: Install wfl (expyre and universalSOAP are dependencies)
run: pip install .
run: python3 -m pip install .

- name: Install Quantum Espresso
run: |
@@ -138,17 +150,12 @@
run: |
echo $HOME/bin >> $GITHUB_PATH
- name: Install xTB
run: |
conda install -c conda-forge xtb-python
pip install typing-extensions
- name: Install MPI dependencies
if: env.use-mpi
run: |
# this can easily be turned off if needed
conda install -c conda-forge mpi4py openmpi pytest-mpi
pip install mpipool
python3 -m pip install mpipool
- name: Install and configure slurm and ExPyRe
run: |
@@ -203,6 +210,10 @@ jobs:
- name: Test with pytest - coverage
if: env.coverage-on-version == matrix.python-version
run: |
echo "BOB pre actual pytest"
which python3
python3 -m pip list | grep numpy
python3 -c "import numpy; print(numpy.__file__, numpy.__version__)"
rm -rf $HOME/pytest_cov
mkdir $HOME/pytest_cov
#
20 changes: 20 additions & 0 deletions README.md
@@ -8,9 +8,29 @@ The main functions of Workflow is to efficiently parallelise operations over a s

For examples and more information see [documentation](https://libatoms.github.io/workflow/)

NOTE: because of the very large time intervals between official ASE releases, `wfl` is typically
set up for (and tested against) the latest ASE gitlab repo `master` branch. Recent changes
that require this support include variable cell minimization using `FrechetCellFilter` and
`Espresso` calculator configuration. See documentation link above for installation instructions.


# Recent changes

v0.2.3:

- Add wfl.generate.neb, with the required improved support for passing ConfigSet.groups() to
autoparallelized functions

- Improved handling of old and new style ase.calculators.espresso.Espresso initialization

v0.2.2:

- Improve checking of DFT calculator convergence

v0.2.1:

- Fix group iterator

v0.2.0:

- Change all wfl operations to use explicit random number generator [pull 285](https://github.com/libAtoms/workflow/pull/285), to improve reproducibility of scripts and reduce the chances that on script rerun, cached jobs will not be recognized due to uncontrolled change in random seed (as in [issue 283](https://github.com/libAtoms/workflow/issues/283) and [issue 284](https://github.com/libAtoms/workflow/issues/284)). Note that this change breaks backward compatibility because many functions now _require_ an `rng` argument, for example
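The explicit-`rng` convention above can be illustrated with a short sketch; the commented `md.md` call shape is assumed from this changelog entry, not verified against the wfl API:

```python
import numpy as np

# Two generators built from the same seed produce identical streams,
# which is what makes script reruns reproducible and lets cached jobs
# be recognized (the failure mode described in issues 283/284).
rng_a = np.random.default_rng(5)
rng_b = np.random.default_rng(5)
a = rng_a.random(3)
b = rng_b.random(3)
print(bool(np.allclose(a, b)))

# A wfl operation would then receive the generator explicitly, e.g.
# (assumed call shape, illustrative only):
#   md.md(inputs, outputs, ..., rng=np.random.default_rng(5))
```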
2 changes: 1 addition & 1 deletion complete_pytest.tin
@@ -32,7 +32,7 @@ export ASE_VASP_COMMAND_GAMMA=vasp.gamma.serial
export PYTEST_VASP_POTCAR_DIR=$VASP_PATH/pot/rev_54/PBE
# QE
module load dft/pwscf
export PYTEST_WFL_ASE_ESPRESSO_COMMAND="mpirun -np 1 pw.x"
export PYTEST_WFL_ASE_ESPRESSO_COMMAND="env MPIRUN_EXTRA_ARGS='-np 1' pw.x"
# no ORCA

export OPENBLAS_NUM_THREADS=1
16 changes: 15 additions & 1 deletion docs/source/index.rst
@@ -21,7 +21,21 @@ Quick start that installs all of the mandatory dependencies:

.. code-block:: sh

    pip install git+https://github.com/libAtoms/workflow
    python3 -m pip install git+https://github.com/libAtoms/workflow
.. warning::

    `wfl` requires ASE, so `ase` is listed as a `pip` dependency,
    and if it is not already installed, `pip install` will pull in the
    latest `pypi` release. However, because of the long gaps between
    official releases, the latest `pypi` version is often quite old,
    and `wfl` has some functionality that requires a newer version.
    To ensure a sufficiently up-to-date version is available, install
    the latest `ase` from gitlab before installing `wfl`, with a
    command such as

    .. code-block:: sh

        python3 -m pip install git+https://gitlab.com/ase/ase
***************************************
Repository
2 changes: 1 addition & 1 deletion setup.py
@@ -4,7 +4,7 @@
name="wfl",
version="0.2.3",
packages=setuptools.find_packages(exclude=["tests"]),
install_requires=["click>=7.0", "numpy", "ase>=3.21", "pyyaml", "spglib", "docstring_parser",
install_requires=["click>=7.0", "numpy", "ase>=3.22.1", "pyyaml", "spglib", "docstring_parser",
"expyre-wfl @ https://github.com/libAtoms/ExPyRe/tarball/main",
"universalSOAP @ https://github.com/libAtoms/universalSOAP/tarball/main"],
entry_points="""
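The tightened floor in `install_requires` (`ase>=3.22.1`) can be sketched as a toy comparison; this helper is illustrative, handles only plain dotted numeric versions, and is not what pip's PEP 440 resolver actually does:

```python
def meets_minimum(installed: str, minimum: str = "3.22.1") -> bool:
    # Compare versions as tuples of integers, e.g. "3.22.1" -> (3, 22, 1).
    # Pre-releases, post-releases etc. are deliberately out of scope here.
    def as_tuple(v):
        return tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

print(meets_minimum("3.22.1"))  # -> True
print(meets_minimum("3.21.0"))  # -> False
```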
52 changes: 52 additions & 0 deletions tests/test_configset.py
@@ -107,6 +107,58 @@ def test_mult_files_mult_Atoms(tmp_path, ats):
locs = [f" / {i0} / {i1}" for i0 in range(2) for i1 in range(5)]
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

def test_mult_files_mult_Atoms_glob_file(tmp_path, ats):
print("CHECK mult file with mult Atoms using a glob for the filename")
ase.io.write(tmp_path / "ats_0.xyz", ats[0:5])
ase.io.write(tmp_path / "ats_1.xyz", ats[5:10])
locs = [f" / {i0} / {i1}" for i0 in range(2) for i1 in range(5)]

# file_root + glob in filename
cs = ConfigSet("ats_*.xyz", file_root=tmp_path)
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

# glob in full pathname
cs = ConfigSet(tmp_path / "ats_*.xyz")
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

# glob in absolute pathname
cs = ConfigSet(tmp_path.absolute() / "ats_*.xyz")
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

def test_mult_files_mult_Atoms_glob_dir(tmp_path, ats):
print("CHECK mult file with mult Atoms using a glob for directory that contains the files")
(tmp_path / "dir_0").mkdir()
(tmp_path / "dir_1").mkdir()
ase.io.write(tmp_path / "dir_0" / "ats.xyz", ats[0:5])
ase.io.write(tmp_path / "dir_1" / "ats.xyz", ats[5:10])
locs = [f" / {i0} / {i1}" for i0 in range(2) for i1 in range(5)]

# glob for dir name, but same filename
cs = ConfigSet(tmp_path / "dir_*" / "ats.xyz")
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

# workdir with glob for dir name, but same filename
cs = ConfigSet("dir_*/ats.xyz", file_root=tmp_path)
check_ConfigSet(cs, locs, gather_numbers([ats[0:5], ats[5:10]]))

def test_mult_files_mult_Atoms_mult_glob_dir(tmp_path, ats):
print("CHECK mult file with mult Atoms using multiple globs for the directories that contain the files")
(tmp_path / "dir_0").mkdir()
(tmp_path / "dir_1").mkdir()
(tmp_path / "other_dir_0").mkdir()
ase.io.write(tmp_path / "dir_0" / "ats.xyz", ats[0:3])
ase.io.write(tmp_path / "dir_1" / "ats.xyz", ats[3:6])
ase.io.write(tmp_path / "other_dir_0" / "ats.xyz", ats[6:10])
locs = [f" / {i0} / {i1}" for i0, i1 in [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2), (2, 0), (2, 1), (2, 2), (2, 3)]]

# glob for dir name, but same filename
cs = ConfigSet([tmp_path / "dir_[01]" / "ats.xyz", tmp_path / "other_dir_*" / "ats.xyz"])
check_ConfigSet(cs, locs, gather_numbers([ats[0:3], ats[3:6], ats[6:10]]))

# workdir with glob for dir name, but same filename
cs = ConfigSet(["dir_[0-1]/ats.xyz", "other_dir_*/ats.xyz"], file_root=tmp_path)
check_ConfigSet(cs, locs, gather_numbers([ats[0:3], ats[3:6], ats[6:10]]))
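The glob handling these tests exercise can be sketched as follows; `resolve_inputs` is a hypothetical stand-in for illustration, not ConfigSet's actual implementation:

```python
import glob
from pathlib import Path

def resolve_inputs(patterns, file_root=None):
    # Each pattern may contain shell-style globs (including character
    # classes like dir_[01]); an optional file_root is prepended before
    # expansion, matching how the tests pass relative patterns.
    if isinstance(patterns, (str, Path)):
        patterns = [patterns]
    files = []
    for pat in patterns:
        full = str(Path(file_root) / pat) if file_root is not None else str(pat)
        files.extend(sorted(glob.glob(full)))
    return files
```

With the directory layout built above (`dir_0`, `dir_1`, `other_dir_0`, each holding an `ats.xyz`), the two-pattern call would expand to three files in a deterministic order.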

def test_single_file_tree_Atoms(tmp_path, ats):
for i in range(0, 3):
ats[i].info["_ConfigSet_loc"] = f" / 0 / {i}"
56 changes: 56 additions & 0 deletions tests/test_remote_run.py
@@ -10,6 +10,7 @@
import ase.io
from ase.atoms import Atoms

from ase.build import bulk
from ase.calculators.emt import EMT

import pytest
@@ -21,6 +22,7 @@
from wfl.generate import optimize, md
from wfl.calculators import generic
from wfl.calculators.vasp import Vasp
from wfl.calculators.espresso import Espresso
from wfl.autoparallelize import AutoparaInfo

from expyre.func import ExPyReJobDiedError
@@ -33,6 +35,14 @@ def test_generic_calc(tmp_path, expyre_systems, monkeypatch, remoteinfo_env):
do_generic_calc(tmp_path, sys_name, monkeypatch, remoteinfo_env)


def test_generic_calc_qe(tmp_path, expyre_systems, monkeypatch, remoteinfo_env):
for sys_name in expyre_systems:
if sys_name.startswith('_'):
continue

do_generic_calc_qe(tmp_path, sys_name, monkeypatch, remoteinfo_env)


def test_minim(tmp_path, expyre_systems, monkeypatch, remoteinfo_env):
for sys_name in expyre_systems:
if sys_name.startswith('_'):
@@ -174,6 +184,52 @@ def do_generic_calc(tmp_path, sys_name, monkeypatch, remoteinfo_env):
assert dt_rerun < dt / 4.0


# copied from calculators/test_qe.py::test_qe_calc
def do_generic_calc_qe(tmp_path, sys_name, monkeypatch, remoteinfo_env):
ri = {'sys_name': sys_name, 'job_name': 'pytest_'+sys_name,
'resources': {'max_time': '1h', 'num_nodes': 1},
'num_inputs_per_queued_job': -36, 'check_interval': 10}

qe_cmd = os.environ.get("PYTEST_WFL_ASE_ESPRESSO_COMMAND")
if qe_cmd is None:
pytest.skip("no PYTEST_WFL_ASE_ESPRESSO_COMMAND to specify executable")
pspot = tmp_path / "Si.UPF"
shutil.copy(Path(__file__).parent / "assets" / "QE" / "Si.pz-vbc.UPF", pspot)

remoteinfo_env(ri)
print('RemoteInfo', ri)

at = bulk("Si")
at.positions[0, 0] += 0.01
at0 = Atoms("Si", cell=[6.0, 6.0, 6.0], positions=[[3.0, 3.0, 3.0]], pbc=False)

kw = dict(
pseudopotentials=dict(Si=pspot.name),
input_data={"SYSTEM": {"ecutwfc": 40, "input_dft": "LDA",}},
kpts=(2, 2, 2),
conv_thr=0.0001,
calculator_exec=qe_cmd,
pseudo_dir=str(pspot.parent)
)

calc = (Espresso, [], kw)

# output container
c_out = OutputSpec("qe_results.xyz", file_root=tmp_path)

results = generic.calculate(
inputs=[at0, at],
outputs=c_out,
calculator=calc,
output_prefix='QE_',
autopara_info={"remote_info": ri}
)

for at in results:
assert "QE_energy" in at.info
assert "QE_forces" in at.arrays
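The `calc = (Espresso, [], kw)` tuple above is a calculator specification — class, positional args, keyword args — a form that stays picklable for shipping to remote jobs. A sketch of how such a spec can be instantiated on the worker side (illustrative helper with a dummy class, not wfl's actual internals):

```python
def construct_calculator(spec):
    # Unpack a (Class, args, kwargs) spec and build the calculator.
    cls, args, kwargs = spec
    return cls(*args, **kwargs)

class DummyCalc:
    # Stand-in for a real ASE calculator class such as Espresso.
    def __init__(self, *args, **kwargs):
        self.kwargs = kwargs

calc = construct_calculator((DummyCalc, [], {"ecutwfc": 40}))
print(calc.kwargs["ecutwfc"])  # -> 40
```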


def do_minim(tmp_path, sys_name, monkeypatch, remoteinfo_env):
ri = {'sys_name': sys_name, 'job_name': 'pytest_'+sys_name,
'resources': {'max_time': '1h', 'num_nodes': 1},
