DOC: Improve documentation at Getting Started section
acsenrafilho committed Nov 16, 2024
1 parent a5ad9e7 commit 2c4b105
Showing 1 changed file with 37 additions and 14 deletions.
51 changes: 37 additions & 14 deletions docs/getting_started.md
@@ -27,54 +27,77 @@ The example above creates an `asldata.ASLData` object, called `data`, which will

The following examples represent some common situations in ASL data processing:

1. Load a pCASL and M0 data for basic CBF/ATT mapping
1. Load a pCASL and M0 data and reconstruct a basic CBF/ATT mapping

```python
from asltk.asldata import ASLData
from asltk.reconstruction import CBFMapping

data = ASLData(
pcasl='./tests/files/pcasl_mte.nii.gz',
m0='path/to/m0.nii.gz',
ld_values=[100.0, 150.0, 250.0],
pld_values=[100.0, 120.0, 150.0],
te_values=[12.5, 36.7, 85.5])

cbf = CBFMapping(data) # insert the ASLData in the CBFMapping instance
out = cbf.create_map() # Effectively creates the CBF/ATT maps

out.get('cbf') # Get the CBF map image
out.get('att') # Get the ATT map image
```

!!! info
This example uses multiTE-ASL data to reconstruct the CBF/ATT maps. However, this is only a simple illustration: the same output can be obtained from a simple pCASL acquisition in which no TE values were acquired. In that case, the `te_values` parameter of the `ASLData` constructor is not necessary.

2. Load a pCASL and M0 data for multiTE-ASL processing, obtaining the T1 exchange map from blood to Grey Matter

```python
from asltk.asldata import ASLData
from asltk.reconstruction import MultiTE_ASLMapping

data = ASLData(
pcasl='./tests/files/pcasl_mte.nii.gz',
m0='path/to/m0.nii.gz',
ld_values=[100.0, 150.0, 250.0],
pld_values=[100.0, 120.0, 150.0],
te_values=[12.5, 36.7, 85.5])

mte = MultiTE_ASLMapping(data) # insert the ASLData in the MultiTE_ASLMapping instance
out = mte.create_map() # Effectively creates the T1 blood-GM exchange map

out.get('t1blgm') # Get the T1 blood-GM exchange map image
```

!!! note
By default, all the `reconstruction` classes use a multiprocessing strategy across CPU threads, which accelerates the map creation. As an internal standard, all the available CPU cores in the machine are recruited. If your machine or experiment cannot afford that many CPUs, a lower value can be set using the `cores` parameter of the `create_map` method.
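As a rough sketch of that note, a conservative core count can be derived from the machine itself; the `create_map` call below is commented out because it assumes the `cbf` object from the first example:

```python
import os

# Use half of the available CPU cores (at least one) instead of all of them.
n_cores = max(1, (os.cpu_count() or 1) // 2)

# Hypothetical usage following the note above (requires a CBFMapping
# instance `cbf` as shown in the first example):
# out = cbf.create_map(cores=n_cores)
print(n_cores)
```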

## Load and process an image

As a standard notation, the `asltk` library assumes that all image data files are stored and manipulated as `numpy` objects. Therefore, the following code snippets represent the `load_image` and `save_image` pattern adopted in the tool:

1. Loading and image
1. Loading and saving an image

```python
from asltk import utils
from asltk.utils import load_image, save_image

img = utils.load_image('path/to/pcasl.nii.gz')
img = load_image('path/to/pcasl.nii.gz') # Loading an image
type(img)
<class 'numpy.ndarray'>

save_image(img, 'path/to/save/the/image.nrrd') # Saving the image using NRRD file format

```

!!! warning
The `asltk` uses the `SimpleITK` library to load and save images due to it's long list of image format options, e.g. NifTI, Nrrd, MHA, etc. However, in order to transpose to `numpy.array` data format for image processing, it is important to note that the image space rasteting format relies as follows:
The `asltk` package uses the `SimpleITK` library to load and save images due to its long list of supported image formats, e.g. NIfTI, NRRD, MHA, etc. However, when converting to the `numpy.array` data format for image processing, it is important to note that the axis ordering differs between the two representations:
```
SimpleITK -> (x,y,z,...,n_d)
Numpy -> (n_d, ..., z,y,x)
```
Where `n_d` represents a higher-order dimension of the data.
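The axis reversal described in the warning above can be checked with `numpy` alone; the image size below is a hypothetical `(x, y, z)` example, not one of the library's test files:

```python
import numpy as np

# A hypothetical SimpleITK image size, reported as (x, y, z) = (128, 96, 40).
sitk_size = (128, 96, 40)

# The corresponding numpy array (as returned by SimpleITK's
# GetArrayFromImage) is indexed in reverse order: (z, y, x).
arr = np.zeros(sitk_size[::-1])
print(arr.shape)  # (40, 96, 128)
```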

2. Saving an image

```python
from asltk import utils

img = utils.load_image('path/to/pcasl.nii.gz')
utils.save_image(img, 'path/to/save/image.nii.gz')
```

Basically, one can use our tool for the general pipeline:

* Loading data and metadata,
