
Dutch UD syntax model, 12 layers, 384 hidden units, ALBERT architecture

Released by @danieldk on 28 Jul 17:50

Dutch UD syntax model:

  • Universal tags
  • Lemmas
  • Morphology
  • Universal dependencies

Distilled from a finetuned XLM-RoBERTa large model into a transformer with 12 hidden layers, 384 hidden units, and 12 attention heads. The model uses the ALBERT architecture with 6 layer groups and 128-dimensional piece embeddings.
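
For reference, the hyperparameters above roughly correspond to the following Hugging Face `AlbertConfig`. This is only an illustrative sketch of the architecture described here, not how the model is packaged or loaded; the vocabulary size and intermediate size are assumptions.

```python
from transformers import AlbertConfig

# Illustrative configuration matching the architecture described above.
# vocab_size and intermediate_size are assumptions, not taken from this release.
config = AlbertConfig(
    vocab_size=30000,        # assumption: the actual piece vocabulary may differ
    embedding_size=128,      # 128-dimensional piece embeddings
    hidden_size=384,         # 384 hidden units
    num_hidden_layers=12,    # 12 hidden layers
    num_hidden_groups=6,     # 6 layer groups (parameters shared within each group)
    num_attention_heads=12,  # 12 attention heads
    intermediate_size=1536,  # assumption: 4 × hidden_size, a common default
)
print(config)
```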