Set the pad_token of the tokenizer & use AriaProcessor during evaluation & fix: use the length of input_ids to find the output slice instead of input_string #18

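A minimal sketch of the three changes named in the title, assuming a Hugging Face-style processor/model pair; the checkpoint id, pad-token choice, prompt format, and image URL below are illustrative assumptions, not the PR's actual code. The key point of the third fix is that `model.generate` returns the prompt tokens followed by the new tokens, so slicing by `input_ids` length is more robust than decoding everything and stripping the input string.

```python
# Hedged sketch of the fixes named in the PR title; model id, pad-token choice,
# prompt, and image are placeholders for illustration only.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "rhymes-ai/Aria"  # assumed checkpoint name
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Fix 1: make sure the tokenizer has a pad token before batched evaluation.
if processor.tokenizer.pad_token is None:
    processor.tokenizer.pad_token = processor.tokenizer.unk_token  # assumed choice

image = Image.open(requests.get("https://example.com/cat.png", stream=True).raw)
prompt = "<image>\nDescribe this image."  # assumed prompt format

# Fix 2: build evaluation inputs with the processor (AriaProcessor).
inputs = processor(text=prompt, images=image, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=128)

# Fix 3: slice the generated sequence by the length of input_ids instead of
# decoding the full output and removing the input string.
input_len = inputs["input_ids"].shape[1]
generated_ids = output_ids[0][input_len:]
answer = processor.tokenizer.decode(generated_ids, skip_special_tokens=True)
print(answer)
```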