This project is a dictionary-based lemmatizer written in Go. It requires Git LFS to fetch the large dictionary files.
A lemmatizer is a tool that finds the base form (lemma) of a word.
| Lang    | Input      | Output  |
|---------|------------|---------|
| English | aligning   | align   |
| Swedish | sprungit   | springa |
| French  | abattaient | abattre |
It's based on the dictionaries found on michmech/lemmatization-lists, which are available under the Open Database License. This project would not be feasible without them.
At the moment golem supports English, Swedish, French, Spanish, Italian, and German, but adding another language should be no more trouble than obtaining a dictionary for it; some are already available on lexiconista. Please let me know if there is something you would like to see in here, or fork the project and open a pull request.
```go
package main

import (
	"github.com/aaaton/golem"
	"github.com/aaaton/golem/dicts/en"
)

func main() {
	// The language packages are available under golem/dicts;
	// "en" is the English dictionary.
	lemmatizer, err := golem.New(en.New())
	if err != nil {
		panic(err)
	}
	word := lemmatizer.Lemma("Abducting")
	if word != "abduct" {
		panic("The output is not what is expected!")
	}
}
```
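Other languages follow the same pattern: import a different package from golem/dicts and pass its value to `golem.New`. The sketch below does this for Swedish; it assumes the `sv` pack exposes the same `New()` constructor as the English one shown above.

```go
package main

import (
	"fmt"

	"github.com/aaaton/golem"
	"github.com/aaaton/golem/dicts/sv"
)

func main() {
	// Build a lemmatizer from the Swedish dictionary pack instead of the English one.
	// Assumes the Swedish pack mirrors the English pack's New() constructor.
	lemmatizer, err := golem.New(sv.New())
	if err != nil {
		panic(err)
	}
	// "sprungit" is an inflected form of "springa", as in the table above.
	fmt.Println(lemmatizer.Lemma("sprungit")) // springa
}
```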
To regenerate the files, run `make all`. This requires go-bindata to be installed.
Thanks to:

- axamon
- charlesgiroux
- glaslos