update docs
yomichi committed Nov 5, 2024
1 parent a04fcc5 commit f6c38dc
Showing 6 changed files with 157 additions and 77 deletions.
36 changes: 29 additions & 7 deletions docs/sphinx/en/source/how_to_use/index.rst
@@ -133,12 +133,21 @@ and abICS will take care of generating the coordinates section at each sampling
Machine learning trainer/calculator-specific notes
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- abICS control file

- In the ``[solver]`` section, set ``perturb`` to 0.0.

.. code-block:: toml

   type = "aenet"
   perturb = 0.0

aenet
*****

- URL : http://ann.atomistic.net

- Checked with aenet 2.0.4.

- Reference file rules

@@ -151,17 +160,30 @@ aenet
- Place the input file ``predict.in`` for ``predict.x`` in the ``predict`` directory to evaluate the energy for the input coordinates using the trained potential model.


NequIP
******

- URL : https://github.com/mir-group/nequip

- Checked with NequIP 0.6.1.

- Reference files (for specific examples of the reference files, see the tutorial)

- Place the NequIP input file ``input.yaml`` in the ``train`` directory under the directory set in ``base_input_dir`` of the ``[trainer]`` section.

- Set the ratio of training data to validation data with ``n_train`` and ``n_val``. For example, setting ``n_train = 80%`` and ``n_val = 20%`` uses 80% of the data for training and 20% for validation (see the excerpt below).

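A minimal sketch of the corresponding lines in ``input.yaml`` (the 80%/20% split is the value used in the tutorial; adjust it to your own dataset):

.. code-block:: yaml

   # excerpt from the NequIP input.yaml placed in the train directory
   n_train: 80%   # fraction of the dataset used for training
   n_val: 20%     # fraction of the dataset used for validation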

MLIP-3
******

- URL : https://gitlab.com/ashapeev/mlip-3

- Checked with commit hash 5f6970e3966c5941a4b42b27a3e9170f162532a0 (2023-06-06T21:27:11).

- Reference files (for specific examples of the reference files, see the tutorial)

- Place the MLIP-3 input file ``input.almtp`` in the ``train`` directory under the directory set in ``base_input_dir`` of the ``[trainer]`` section (a minimal example is sketched below).

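A minimal sketch of the corresponding settings, following the tutorial's ``[train]`` section; the path to ``mlp`` is an example location and should be adjusted to your environment.

.. code-block:: toml

   [train]
   type = 'mlip_3'
   base_input_dir = './mlip_3_train_input'
   # path to the MLIP-3 executable "mlp"; example location, adjust to your setup
   exe_command = { train = '~/github/mlip-3/bin/mlp' }
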
Creating a set of training data
--------------------------------
34 changes: 31 additions & 3 deletions docs/sphinx/en/source/inputfiles/parameter_train.rst
@@ -5,7 +5,7 @@

``abics_train`` creates and trains a regression model from configurations to energies.
In practice, ``abics_train`` uses an external program to train the model.
In the current version, aenet, NequIP, and MLIP-3 are supported as external programs.
For software-specific notes (such as input file names), see :ref:`trainer_specific_notes`.

The input information for ``abics_train`` is described in the ``[trainer]`` section. The description of each parameter is as follows.
@@ -33,7 +33,7 @@ Key words

**Format :** str

**Description :** The trainer used to generate the neural network potential (currently 'aenet', 'nequip', and 'mlip_3' are available).

- ``base_input_dir``

@@ -44,10 +44,38 @@

- ``exe_command``

**Format :** dict

**Description :**
Commands used to run the trainer (a combined example is given after this list); for aenet, you need to specify the paths to ``generate.x`` and ``train.x``.

- ``type = 'aenet'``

- ``generate`` and ``train`` keys are required.
- ``generate``

- Specify the path to ``generate.x`` of aenet.

- ``train``

- Specify the path to ``train.x`` of aenet.
- The MPI-parallel version is available; in that case, include the command used to launch MPI (e.g., ``srun``, ``mpirun``).

- Array format is supported for compatibility with abICS 2.0 and earlier.
The first element is ``generate``, and the second element is ``train``.

- ``type = 'nequip'``

- ``train``

- Specify the path to ``nequip-train``.

- ``type = 'mlip_3'``

- ``train``

- Specify the path to ``mlp``.

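As an illustration, an aenet ``exe_command`` written in the dictionary format, with the equivalent legacy array format kept as a comment, might look like the sketch below; the executable paths are examples only and should be adjusted to your environment.

.. code-block:: toml

   [train]
   type = 'aenet'
   base_input_dir = './aenet_train_input'
   # legacy array form (abICS 2.0 and earlier): first element = generate, second = train
   # exe_command = ['~/git/aenet/bin/generate.x-2.0.4-ifort_serial',
   #                'srun ~/git/aenet/bin/train.x-2.0.4-ifort_intelmpi']

   [train.exe_command]
   generate = '~/git/aenet/bin/generate.x-2.0.4-ifort_serial'
   train = 'srun ~/git/aenet/bin/train.x-2.0.4-ifort_intelmpi'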

- ``ignore_species``

49 changes: 23 additions & 26 deletions docs/sphinx/en/source/tutorial/other_models.rst
@@ -20,15 +20,15 @@ Install it with the following command.

.. code-block:: bash

   $ python3 -m pip install wandb
   $ python3 -m pip install nequip

Also, when installing abICS, you can install NequIP by specifying the [nequip] option.

.. code-block:: bash

   $ cd /path/to/abics
   $ python3 -m pip install '.[nequip]'

Preparation of input files
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -42,18 +42,17 @@ Below, we extract [sampling.solver] and [train] with changes from the aenet input.
type = 'nequip'
base_input_dir = './baseinput_nequip'
perturb = 0.0
ignore_species = ["O"]
[train]
type = 'nequip'
base_input_dir = './nequip_train_input'
exe_command = { train = 'nequip-train' }
ignore_species = ["O"]
vac_map = []
restart = false
Also, create the NequIP input file ``input.yaml`` in the ``nequip_train_input/train`` directory.

.. code-block:: yaml
Expand Down Expand Up @@ -100,8 +99,8 @@ Also, create the NequIP input file input.yaml in the nequip_train_input/train di
# verbose: debug
# training
n_train: 80%
n_val: 20%
batch_size: 5
train_val_split: random
#shuffle: true
@@ -120,44 +119,43 @@ The procedure of model learning and sampling is the same as aenet.
Sampling with Allegro
----------------------------------------------

Models implemented as extensions of NequIP can be used as is by installing the extension package and setting the input file of NequIP appropriately. Allegro is one of the extension packages.

Installation of Allegro
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

To use ``allegro``, you need to install Allegro.

Install Allegro with the following command.

.. code-block:: bash

   $ git clone --depth 1 https://github.com/mir-group/allegro.git
   $ cd allegro
   $ python3 -m pip install .
Preparation of input files
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

First, prepare ``input_allegro.toml`` and set the parameters required to run Allegro.
Below, we extract ``[sampling.solver]`` and ``[train]`` with changes from the aenet input.

.. code-block:: toml
[sampling.solver]
type = 'allegro'
base_input_dir = './baseinput_allegro'
perturb = 0.0
ignore_species = ["O"]
[train]
type = 'allegro'
base_input_dir = './allegro_train_input'
exe_command = { train = 'nequip-train' }
ignore_species = ["O"]
vac_map = []
restart = false
Also, create the Allegro input file ``input.yaml`` in the ``allegro_train_input/train`` directory.

.. code-block:: yaml
@@ -174,7 +172,6 @@ Also, create the Allegro input file input.yaml in the allegro_train_input/train directory.
r_max: 8.0
parity: o3_full
num_layers: 2
num_features: 16
env_embed_multiplicity: 16
embed_initial_edge: true
@@ -208,8 +205,8 @@ Also, create the Allegro input file input.yaml in the allegro_train_input/train directory.
# verbose: debug
# training
n_train: 80%
n_val: 20%
batch_size: 5
train_val_split: random
#shuffle: true
@@ -246,8 +243,8 @@ Install it with the following command.
Preparation of input files
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

First, prepare ``input_mlip3.toml`` and set the parameters required to run MLIP-3.
Below, we extract ``[sampling.solver]`` and ``[train]`` with changes from the aenet input.

.. code-block:: toml
@@ -256,22 +253,22 @@ Below, we extract [sampling.solver] and [train] with changes from the aenet input.
path = '~/github/mlip-3/bin/mlp'
base_input_dir = './baseinput'
perturb = 0.0
run_scheme = 'subprocess'
ignore_species = ["O"]
[train]
type = 'mlip_3'
base_input_dir = './mlip_3_train_input'
exe_command = { train = '~/github/mlip-3/bin/mlp' }
ignore_species = ["O"]
vac_map = []
restart = false
In the above, ``path`` in ``[sampling.solver]`` and ``exe_command`` in ``[train]``
specify the path to the MLIP-3 executable ``mlp``.
Please change them according to your environment.

Also, create the MLIP-3 input file ``input.almtp`` in the ``mlip_3_train_input/train`` directory.

.. code-block:: none
31 changes: 23 additions & 8 deletions docs/sphinx/ja/source/how_to_use/index.rst
@@ -142,7 +142,7 @@ aenet

- URL : http://ann.atomistic.net

- Checked with aenet 2.0.4.

- Reference files (for specific examples of the reference files, see the tutorial)

@@ -155,16 +155,31 @@ aenet
- Place the input file ``predict.in`` for ``predict.x``, used to evaluate the energy of input coordinates with the trained potential model, in the ``predict`` directory.


NequIP
******

- URL : https://github.com/mir-group/nequip

- Checked with NequIP 0.6.1.

- Reference files (for specific examples of the reference files, see the tutorial)

- Place the NequIP input file ``input.yaml`` in the ``train`` directory under the directory set in ``base_input_dir`` of the ``[train]`` section.

- Specify the ratio of training data to validation data with ``n_train`` and ``n_val``. For example, setting ``n_train = 80%`` and ``n_val = 20%`` uses 80% of the data for training and 20% for validation.


MLIP-3
******

- URL : https://gitlab.com/ashapeev/mlip-3

- Checked with commit hash 5f6970e3966c5941a4b42b27a3e9170f162532a0 (2023-06-06T21:27:11).

- Reference files (for specific examples of the reference files, see the tutorial)

- Place the MLIP-3 input file ``input.almtp`` in the ``train`` directory under the directory set in ``base_input_dir`` of the ``[train]`` section.

Creating a set of training data
--------------------------------
40 changes: 30 additions & 10 deletions docs/sphinx/ja/source/inputfiles/parameter_train.rst
@@ -6,7 +6,7 @@
Configure the trainer that learns a model for predicting the configuration energy from the training data.

An external program is used to build and train the prediction model.
Currently, aenet, NequIP, and MLIP-3 are supported.
For software-specific notes (such as input file names), see :ref:`trainer_specific_notes`.

This section has the following format.
@@ -16,9 +16,10 @@
[train] # settings for the model trainer
type = 'aenet'
base_input_dir = './aenet_train_input'
ignore_species = ["O"]
[train.exe_command]
generate = '~/git/aenet/bin/generate.x-2.0.4-ifort_serial'
train = 'srun ~/git/aenet/bin/train.x-2.0.4-ifort_intelmpi'
Input format
@@ -48,20 +49,39 @@

- ``exe_command``

**Format :** dict

**Description :**
Specify the commands used by the trainer.
Command-line arguments may be included, but do not include the trainer's own input file (such as ``input.yaml``).

- ``type = 'aenet'``

- Has the two keys ``generate`` and ``train``.
- ``generate``

- Specify the path to aenet's ``generate.x``.

- ``train``

- Specify the path to aenet's ``train.x``.
- The MPI-parallel version is available; in that case, as in the example above, also set the command used to launch MPI (e.g., ``srun``, ``mpirun``).

- The array format is also supported for compatibility with abICS 2.0 and earlier.
The first element is ``generate`` and the second is ``train``.

- ``type = 'nequip'``

- ``train``

- Specify the path to ``nequip-train``.

- ``type = 'mlip_3'``

- ``train``

- Specify the path to ``mlp``.

- ``ignore_species``
