Update Getting started so its up to date (#13)
Signed-off-by: Kelly Brown <[email protected]>
kelbrown20 authored Nov 21, 2024
1 parent 6174543 commit 85bc66e
Showing 4 changed files with 214 additions and 248 deletions.
118 changes: 66 additions & 52 deletions docs/getting-started/initilize_ilab.md

# 🏗️ Initialize `ilab`

1. Initialize `ilab` by running the following command:

```shell
ilab config init
```

2. When prompted, press **Enter** to clone the `https://github.com/instructlab/taxonomy.git` repository into the current directory.

**Optional**: If you want to point to an existing local clone of the `taxonomy` repository, you can pass the path interactively or alternatively with the `--taxonomy-path` flag.

`ilab` will use the default configuration file unless otherwise specified. You can override this behavior with the `--config` parameter for any `ilab` command.

3. When prompted, provide the path to your default model. Otherwise, a quantized [Merlinite](https://huggingface.co/instructlab/merlinite-7b-lab-GGUF) model is used by default.

*Example output of steps 1 - 3*

```shell
(venv) $ ilab config init

----------------------------------------------------
         Welcome to the InstructLab CLI
This guide will help you to setup your environment
----------------------------------------------------

Please provide the following values to initiate the environment [press Enter for defaults]:
Path to taxonomy repo [/Users/kellybrown/.local/share/instructlab/taxonomy]:
`taxonomy` seems to not exists or is empty. Should I clone https://github.com/instructlab/taxonomy.git for you? [y/N]: y
Cloning https://github.com/instructlab/taxonomy.git...
Path to your model [/Users/kellybrown/.cache/instructlab/models/merlinite-7b-lab-Q4_K_M.gguf]:
```


You can also download this model with the `ilab model download` command.
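A quick guard around that download can look like the following sketch; the fallback message is illustrative, not output from `ilab` itself:

```shell
# Sketch: fetch the default model explicitly; the guard avoids a confusing
# "command not found" failure when the venv is not active yet
if command -v ilab >/dev/null 2>&1; then
  ilab model download
  ilab_found="yes"
else
  echo "ilab not on PATH - activate your venv first"
  ilab_found="no"
fi
```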

4. The InstructLab CLI auto-detects your hardware and selects the exact system profile that matches your machine. System profiles populate the `config.yaml` file with the proper parameter values based on your detected GPU type and available vRAM.

*Example output of profile auto-detection*

```shell
Generating config file and profiles:
/home/user/.config/instructlab/config.yaml
/home/user/.local/share/instructlab/internal/train_configuration/profiles

We have detected the AMD CPU profile as an exact match for your system.

--------------------------------------------
Initialization completed successfully!
You're ready to start using `ilab`. Enjoy!
--------------------------------------------
```
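The files named in the output above can be checked directly. The following sketch uses the Linux default paths shown in that output (adjust them for macOS) and prints hints instead of failing when `ilab config init` has not been run yet:

```shell
# Sketch: confirm what `ilab config init` generated (Linux default paths)
config_file="$HOME/.config/instructlab/config.yaml"
profile_dir="$HOME/.local/share/instructlab/internal/train_configuration/profiles"
if [ -f "$config_file" ]; then
  echo "config: $config_file"
else
  echo "no config yet - run: ilab config init"
fi
ls "$profile_dir" 2>/dev/null || echo "no training profiles generated yet"
```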
5. If there is no exact match for your system, you can manually select a system profile when prompted. There are also various flags you can use with individual `ilab` commands to take advantage of your GPU, if applicable.

*Example output of selecting a system profile*

```shell
Please choose a system profile to use.
System profiles apply to all parts of the config file and set hardware specific defaults for each command.
First, please select the hardware vendor your system falls into
[1] APPLE
[2] INTEL
[3] AMD
[4] NVIDIA
Enter the number of your choice [0]: 1
You selected: APPLE
Next, please select the specific hardware configuration that most closely matches your system.
[0] No system profile
[1] APPLE M1 ULTRA
[2] APPLE M1 MAX
[3] APPLE M2 MAX
[4] APPLE M2 ULTRA
[5] APPLE M2 PRO
[6] APPLE M2
[7] APPLE M3 MAX
[8] APPLE M3 PRO
[9] APPLE M3
Enter the number of your choice [hit enter for hardware defaults] [0]: 8
You selected: /Users/kellybrown/.local/share/instructlab/internal/system_profiles/apple/m3/m3_pro.yaml
--------------------------------------------
Initialization completed successfully!
You're ready to start using `ilab`. Enjoy!
--------------------------------------------
```

The GPU profiles are listed by GPU type and number of GPUs present. If you happen to have a GPU configuration with a similar amount of vRAM as any of the above profiles, feel free to try them out!

### `ilab` directory layout after initializing your system

### Mac directory

138 changes: 61 additions & 77 deletions docs/getting-started/linux_amd.md
These steps will pull down a premade `qna.yaml` so you can do a local build. Skip the `wget`, `mv`, and `ilab taxonomy diff` if you don't want to do this.

```bash
python3.11 -m venv --upgrade-deps venv
source venv/bin/activate
pip cache remove llama_cpp_python
CMAKE_ARGS="-DLLAMA_HIPBLAS=on \
-DAMDGPU_TARGETS=all \
-DCMAKE_C_COMPILER=/opt/rocm/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/rocm/llvm/bin/clang++ \
-DCMAKE_PREFIX_PATH=/opt/rocm \
-DLLAMA_NATIVE=off" \
pip install 'instructlab[rocm]' \
--extra-index-url https://download.pytorch.org/whl/rocm6.0
which ilab
ilab config init
cd ~/.local/share/instructlab
ilab model train
ilab model convert --model-dir checkpoints/instructlab-granite-7b-lab-mlx-q
ilab model serve --model-path instructlab-granite-7b-lab-trained/instructlab-granite-7b-lab-Q4_K_M.gguf
```
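The `CMAKE_ARGS` variable in the block above is read by the `llama-cpp-python` build backend during `pip install`. The minimal sketch below only shows the mechanism of collecting the CMake flags into one exported variable; the flag values are examples taken from the command above, not a complete set:

```shell
# Sketch: llama-cpp-python reads CMAKE_ARGS at build time, so all CMake
# flags are joined into a single exported variable before pip install runs
CMAKE_ARGS="-DLLAMA_HIPBLAS=on -DLLAMA_NATIVE=off"
export CMAKE_ARGS
echo "build flags: $CMAKE_ARGS"
```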

## Installing `ilab`

The following steps in this document use [Python venv](https://docs.python.org/3/library/venv.html) for virtual environments. However, if you use another tool such as [pyenv](https://github.com/pyenv/pyenv) or [Conda Miniforge](https://github.com/conda-forge/miniforge) for managing Python environments on your machine continue to use that tool instead. Otherwise, you may have issues with packages that are installed but not found in `venv`.
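As a sketch of the `venv` flow used throughout this page (directory name `venv` matches the install steps below):

```shell
# Sketch: create a venv, activate it, and confirm python resolves inside it
python3 -m venv venv
. venv/bin/activate
python -c 'import sys; print(sys.prefix)'   # prints the absolute path of ./venv
deactivate
```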

!!! note
    `pip install` may take some time, depending on your internet connection. If installation fails with the error ``unsupported instruction `vpdpbusd'``, append `-C cmake.args="-DLLAMA_NATIVE=off"` to the `pip install` command.

1) Install with AMD ROCm

```bash
python3 -m venv --upgrade-deps venv
source venv/bin/activate
pip cache remove llama_cpp_python
pip install 'instructlab[rocm]' \
--extra-index-url https://download.pytorch.org/whl/rocm6.0 \
-C cmake.args="-DLLAMA_HIPBLAS=on" \
-C cmake.args="-DAMDGPU_TARGETS=all" \
-C cmake.args="-DCMAKE_C_COMPILER=/opt/rocm/llvm/bin/clang" \
-C cmake.args="-DCMAKE_CXX_COMPILER=/opt/rocm/llvm/bin/clang++" \
-C cmake.args="-DCMAKE_PREFIX_PATH=/opt/rocm" \
-C cmake.args="-DLLAMA_NATIVE=off"
```

!!! note
    On Fedora 40+, use `-DCMAKE_C_COMPILER=clang-17` and `-DCMAKE_CXX_COMPILER=clang++-17`.
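Before running the ROCm install above, you can sanity-check that the compiler paths it references actually exist. The loop itself is a sketch; the paths are the ones used in the install command:

```shell
# Sketch: verify the ROCm clang toolchain referenced by the install command
missing=0
for tool in /opt/rocm/llvm/bin/clang /opt/rocm/llvm/bin/clang++; do
  if [ -x "$tool" ]; then
    echo "found: $tool"
  else
    echo "missing: $tool (is ROCm installed?)"
    missing=$((missing + 1))
  fi
done
echo "missing tools: $missing"
```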

2) From your `venv` environment, verify that `ilab` is installed correctly by running the `ilab` command.

```shell
ilab
```

*Example output of the `ilab` command*

```shell
(venv) $ ilab
Usage: ilab [OPTIONS] COMMAND [ARGS]...
CLI for interacting with InstructLab.

If this is your first time running ilab, it's best to start with `ilab
config init` to create the environment.
Options:
--config PATH Path to a configuration file. [default:
/Users/kellybrown/.config/instructlab/config.yaml]
-v, --verbose Enable debug logging (repeat for even more verbosity)
--version Show the version and exit.
--help Show this message and exit.
Commands:
config Command Group for Interacting with the Config of InstructLab.
data Command Group for Interacting with the Data generated by...
model Command Group for Interacting with the Models in InstructLab.
system Command group for all system-related command calls
taxonomy Command Group for Interacting with the Taxonomy of InstructLab.
Aliases:
chat model chat
generate data generate
serve model serve
train model train
```
!!! important
Every `ilab` command needs to be run from within your Python virtual environment. You can enter the Python environment by running the `source venv/bin/activate` command.
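The check in this note can be scripted; `VIRTUAL_ENV` is the variable that `venv`'s `activate` script sets, and the messages below are illustrative:

```shell
# Sketch: warn when no virtual environment is active before calling ilab
if [ -z "${VIRTUAL_ENV:-}" ]; then
  venv_state="inactive"
  echo "no venv active - run: source venv/bin/activate"
else
  venv_state="active"
  echo "using venv: $VIRTUAL_ENV"
fi
```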