[![Tests](https://github.com/yurujaja/geofm-bench/actions/workflows/python-test.yml/badge.svg)](https://github.com/yurujaja/geofm-bench/actions/workflows/python-test.yml)

# TITLE

## 📚 Introduction

While geospatial foundation models (GFMs) have proliferated rapidly, their evaluations remain inconsistent and narrow. Existing works often utilize suboptimal downstream datasets (e.g., EuroSAT) and tasks (e.g., land cover classification), which constrain comparability and real-world usability. Additionally, a lack of diversity in evaluation protocols, including image resolution and sensor types, further complicates the extensive assessments of GFM performance. To bridge this gap, we propose a standardized evaluation protocol that incorporates a wide-ranging selection of datasets, tasks, resolutions, and sensor types, establishing a robust and widely applicable benchmark for GFMs.
And the following **datasets**:
| AI4SmallFarms | | | | | | |
| BioMassters | | | | | | |

The repository supports the following **tasks** using GFMs:
- [single temporal semantic segmentation](#single-temporal-semantic-segmentation)
- [multi-temporal semantic segmentation](#multi-temporal-semantic-segmentation)
- [change detection](#change-detection)
- [single temporal regression](#single-temporal-regression)
- [multi-temporal regression](#multi-temporal-regression)

It is also possible to train some [supervised baselines](#-fully-supervised-training) based on UNet.

## 🛠️ Setup
Clone the repository:
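
A minimal sketch of the clone and environment-creation steps, assuming the repository URL shown in the badge and citation of this README and a mamba-managed environment (the environment file name below is an assumption, not confirmed here):

```
# clone the benchmark code (URL taken from this README's badge and citation)
git clone https://github.com/yurujaja/geofm-bench.git
cd geofm-bench

# create the environment; the file name is assumed, use the one shipped in the repo
mamba env create -f environment.yaml
```

Once the environment has been created, activate it: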
```
mamba activate geofm-bench8
```

## 🏋️ Training

There are 5 basic component types in our config system:

- `config`: Information on training settings such as batch size, number of epochs, and whether to use wandb. `limited_label` indicates the fraction of the training dataset used: `-1` means the full training set, while `0.5` means 50% of it.
- `encoder_config`: GFM encoder-related parameters. `output_layers` specifies which encoder layers are used by the Upernet decoder.
- `dataset_config`: Information on the downstream dataset, such as image size, band_statistics, etc.
- `segmentor_config`: Parameters for fine-tuning the downstream task decoder, including the head type, loss, optimizer, scheduler, etc.
- `augmentation_config`: The preprocessing and augmentation steps required for the dataset, such as band adaptation, normalization, and resize/crop.

We provide several examples of command lines to initialize different training tasks on a single GPU.

Please note:
- Command-line parameters take priority over the parameters in the config files. For example, to change the batch size without editing the `config`, simply add `--batch_size n` to the command line.
- To use more GPUs or nodes, set `--nnodes` and `--nproc_per_node` accordingly; see https://pytorch.org/docs/stable/elastic/run.html
- To use mixed-precision training, specify either `--fp16` for float16 or `--bf16` for bfloat16. An example combining these options is shown below.
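
As an illustration of these notes, the sketch below scales a run to 4 GPUs on a single node, overrides the batch size from the command line, and enables bfloat16 mixed precision. It is not a complete command on its own: the task-specific configuration arguments from the examples in the next sections still need to be appended.

```
# 4 GPUs on one node, batch size overridden from the command line, bfloat16 on;
# append the task-specific configuration arguments shown in the sections below
torchrun --nnodes=1 --nproc_per_node=4 run.py \
--batch_size 32 --bf16 \
--num_workers 4 --eval_interval 1 --use_wandb
```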

### 💻 Decoder Finetuning
#### Single Temporal Semantic Segmentation

Take the MADOS dataset, Prithvi encoder, and Upernet decoder as an example:
```
torchrun --nnodes=1 --nproc_per_node=1 run.py \
--num_workers 4 --eval_interval 1 --use_wandb
```
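
A fuller sketch of such a command pairs one YAML file with each of the five component types described above. The flag names and config paths below are assumptions for illustration and are not confirmed by this README; check the `configs/` folder of the repository for the actual file names.

```
# sketch only: flag names and config paths are assumed, one per component type
torchrun --nnodes=1 --nproc_per_node=1 run.py \
--config configs/run/default.yaml \
--encoder_config configs/foundation_models/prithvi.yaml \
--dataset_config configs/datasets/mados.yaml \
--segmentor_config configs/segmentors/upernet.yaml \
--augmentation_config configs/augmentations/segmentation_default.yaml \
--num_workers 4 --eval_interval 1 --use_wandb
```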

#### Multi-Temporal Semantic Segmentation

The multi-temporal model `configs/segmentors/upernet_mt.yaml` should be used. In addition, indicate the number of time frames in the dataset config, e.g., `multi_temporal: 6`.
```
torchrun --nnodes=1 --nproc_per_node=1 run.py \
--num_workers 4 --eval_interval 1 --use_wandb
```
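
As above, a sketch of the full multi-temporal command. The intended differences from the single-temporal sketch are the `upernet_mt` segmentor config mentioned in this section and a dataset config that sets `multi_temporal`; the flag names are again assumptions, and the dataset path in angle brackets is a placeholder.

```
# sketch only: same assumed flags as the single-temporal example;
# <multi_temporal_dataset> is a placeholder for a dataset whose config sets
# the number of time frames, e.g. multi_temporal: 6
torchrun --nnodes=1 --nproc_per_node=1 run.py \
--config configs/run/default.yaml \
--encoder_config configs/foundation_models/prithvi.yaml \
--dataset_config configs/datasets/<multi_temporal_dataset>.yaml \
--segmentor_config configs/segmentors/upernet_mt.yaml \
--augmentation_config configs/augmentations/segmentation_default.yaml \
--num_workers 4 --eval_interval 1 --use_wandb
```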

#### Change Detection
```
torchrun ...
```
#### Single Temporal Regression
```
torchrun ...
```

#### Multi-Temporal Regression
```
torchrun ...
```

### 💻 Fully Supervised Training
#### Single Temporal Semantic Segmentation
```
torchrun ...
```

## 🏃 Evaluation
Indicate the `eval_dir` where the checkpoints and configurations are stored.

```
torchrun --nnodes=1 --nproc_per_node=1 run.py --batch_size 1 --eval_dir work-dir/the-folder-where-your-exp-is-saved
```

## ✏️ Contributing
We appreciate all contributions. Please refer to the [Contributing Guidelines](.github/CONTRIBUTING.md).

## ⚠️ Warnings

Some features are under construction:
- automatic download works for all datasets and model weights except the **Five Billion Pixels** and **BioMassters** datasets and the **GFM** model weights.


## 🧮 Some first results

A pre-print is coming soon... Stay tuned!

| Encoder | Dataset | Epochs | mIoU |
|---------|--------------|--------|--------|
| Prithvi | MADOS | 80 | 53.455 |
| Prithvi | HLSBurnScars | 80 | 86.208 |
| Prithvi | Sen1Floods11 | 80 | 87.217 |

Please note:

## 💡 Acknowledgements

## ©️ License

MIT License

Copyright (c) Microsoft Corporation.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

## 📝 Citing

If you use this software in your work, please cite:

```
@misc{pangaea,
  author = {},
  title = {Pangaea},
  year = {2024},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/yurujaja/geofm-bench}},
}
```
