Currently, the framework does not seem to allow for batching inputs, which reduces its usability on larger datasets.
The intuitive approach would be:
from danlp.models import load_bert_tone_model
classifier = load_bert_tone_model()
classifier.predict(["I am very happy", "I am very very happy"])
# {'analytic': 'objective', 'polarity': 'positive'}
While you would expect:
from danlp.models import load_bert_tone_model
classifier = load_bert_tone_model()
classifier.predict(["I am very happy", "I am very very happy"], batch_size=2)
#[{'analytic': 'objective', 'polarity': 'positive'},
# {'analytic': 'objective', 'polarity': 'positive'}]
The reason for adding batch_size=2 is to distinguish between looping through each text and batching them for faster computation on GPUs.
Interestingly, the first approach does not throw an error.
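In the meantime, a workaround is a small wrapper that chunks the input list and collects one result per text. This is only a sketch: predict_batched and fake_predict are hypothetical names, and the inner call still processes one text at a time, so it does not deliver the GPU speedup that true batching inside the model would.

```python
from typing import Callable, Dict, List


def predict_batched(predict_one: Callable[[str], Dict],
                    texts: List[str],
                    batch_size: int = 2) -> List[Dict]:
    """Chunk texts into groups of batch_size and return one result per text.

    Note: this only chunks the Python loop. Real GPU batching would require
    the underlying model's forward pass to accept all texts in one tensor.
    """
    results = []
    for i in range(0, len(texts), batch_size):
        for text in texts[i:i + batch_size]:
            results.append(predict_one(text))
    return results


# Hypothetical stand-in for classifier.predict on a single text,
# used here only so the sketch is self-contained.
def fake_predict(text: str) -> Dict:
    return {'analytic': 'objective', 'polarity': 'positive'}


print(predict_batched(fake_predict,
                      ["I am very happy", "I am very very happy"]))
```

With this wrapper the caller gets the list-of-dicts shape shown above (one dict per input text), which is the behavior the proposed batch_size argument would provide natively.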