Releases: huggingface/optimum-graphcore
v0.3.1: Wav2Vec2, ConvNeXT and BART update
v0.3.0: Training transformers models on IPUs
This release can be considered the first official optimum-graphcore release.
It provides:
- An IPU-specific config class, IPUConfig, enabling the user to specify various IPU-related parameters and share them on the Hugging Face Hub
- A custom trainer class, IPUTrainer, making training on IPUs seamless for the user compared to the regular transformers Trainer (see the usage sketch below)
- A set of example scripts for all the supported tasks
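The sketch below illustrates how IPUConfig and IPUTrainer are meant to fit into the usual transformers workflow. It is an assumption-laden example, not code from this release: the checkpoint names (bert-base-uncased, Graphcore/bert-base-ipu), the glue/sst2 dataset, and the IPUTrainingArguments class are illustrative choices, and exact argument names may differ between releases.

```python
# Hypothetical usage sketch, not taken verbatim from this release's examples.
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

model_name = "bert-base-uncased"  # assumed checkpoint; any supported architecture works
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# IPUConfig carries the IPU-specific settings (e.g. how layers are split across IPUs)
# and can be loaded from / pushed to the Hugging Face Hub like any other config.
ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")  # assumed Hub id

# Small text-classification dataset, used here purely for illustration.
dataset = load_dataset("glue", "sst2")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)

# IPUTrainingArguments mirrors transformers.TrainingArguments; exact fields may vary by release.
args = IPUTrainingArguments(
    output_dir="outputs",
    per_device_train_batch_size=2,
    num_train_epochs=1,
)

# IPUTrainer is meant as a drop-in replacement for transformers.Trainer,
# with the extra ipu_config argument.
trainer = IPUTrainer(
    model=model,
    ipu_config=ipu_config,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
)

trainer.train()
trainer.evaluate()
```

The only IPU-specific pieces in this sketch are the IPUConfig object and the ipu_config argument; everything else follows the regular transformers Trainer API, which is the point of the abstraction.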
Supported Architectures
The following model architectures can be trained and evaluated using the IPUTrainer:
- Bert
- Roberta
- Deberta
- Lxmert
- Hubert
- ViT
Training only
The following model architectures can be trained (and evaluated without generation):
- Bart
- T5
- GPT-2
Training Scripts
- Language modeling
- Text classification
- Token classification
- Multiple choice
- Question answering
- Summarization
- Translation
- Audio classification
- Image classification