The idea came from customer reviews.
- Download the model binary file from the link
- Download the tokenizer from the link
- Unzip both folders into the main directory
- Entry point: src/api

The API returns responses like the following (the example input is a Spanish review, roughly: "The service is very slow. The phone, on the other hand, freezes all the time"); a sketch of calling the API follows the example.
{
  "response": {
    "prediction": {
      "input": "El servicio muy lento. El celular por otro lado se traba todo el tiempo",
      "prediction": "App and Service",
      "prediction values": {
        "app": -2.71728515625,
        "app and service": 5.459717273712158,
        "other": -2.654169797897339,
        "service": 0.3330422639846802
      }
    }
  }
}
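A minimal client sketch for calling the API is below. The host, port, route ("/predict"), and the "input" request field are assumptions for illustration and are not taken from this repo; only the response shape shown above is. The per-class "prediction values" appear to be raw model scores (logits), so the sketch also converts them to probabilities with a softmax for easier reading.

    # Minimal client sketch. Host, port, route, and request schema are
    # assumptions; adjust them to whatever src/api actually exposes.
    import math

    import requests

    review = "El servicio muy lento. El celular por otro lado se traba todo el tiempo"

    resp = requests.post(
        "http://localhost:8000/predict",   # assumed host/port/route
        json={"input": review},            # assumed request field name
        timeout=10,
    )
    resp.raise_for_status()

    prediction = resp.json()["response"]["prediction"]
    print("label:", prediction["prediction"])

    # If the "prediction values" are logits, a numerically stable softmax
    # turns them into per-class probabilities.
    scores = prediction["prediction values"]
    max_score = max(scores.values())
    exps = {label: math.exp(s - max_score) for label, s in scores.items()}
    total = sum(exps.values())
    for label, e in sorted(exps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{label}: {e / total:.3f}")

For the response above, "app and service" dominates the softmax, which matches the returned label "App and Service".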