Sequence Relation Classification with Transformers
As prerequisites, you need Python 3.6+, PyTorch 1.0.0+, and TensorFlow 2.0.0-rc1 installed.
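To check the versions in your environment against these prerequisites, a quick sanity check (plain Python, nothing repo-specific):

    import sys
    import torch
    import tensorflow as tf

    # Print the interpreter and framework versions to compare against the prerequisites.
    print("Python:", sys.version.split()[0])
    print("PyTorch:", torch.__version__)
    print("TensorFlow:", tf.__version__)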
Clone the repository and, ideally inside a Python virtual environment, install all dependencies via:
pip install -r requirements.txt
The data is split into training, development, and test sets, which the trainer reads from the directory passed as --data_dir (data/ in the example below).
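The exact file names and columns depend on the task processor; assuming GLUE-style tab-separated splits named train.tsv, dev.tsv, and test.tsv (hypothetical names, adjust to whatever your processor expects), a quick inspection of the data directory could look like:

    import csv
    import os

    DATA_DIR = "data/"

    # Hypothetical split file names -- adjust to your task processor.
    for split in ("train.tsv", "dev.tsv", "test.tsv"):
        path = os.path.join(DATA_DIR, split)
        if not os.path.exists(path):
            print("missing:", path)
            continue
        with open(path, encoding="utf-8") as f:
            rows = list(csv.reader(f, delimiter="\t"))
        print(split, "->", len(rows), "examples")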
Run the following command to fine-tune a BERT-base model on a sequence classification task.
python sequences-trainer.py \
--model_type bert \
--model_name_or_path bert-base-uncased \
--task_name seq-classification \
--do_train --do_eval \
--data_dir data/ \
--max_seq_length 20 --per_gpu_train_batch_size 4 \
--learning_rate 2e-5 --num_train_epochs 20.0 \
--output_dir gens/ \
--eval_all_checkpoints \
--overwrite_output_dir \
--tokenizer_name bert-base-uncased \
--do_lower_case
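Once training finishes, gens/ holds the fine-tuned weights and tokenizer files. A minimal inference sketch, assuming a transformers 2.x release that provides encode_plus; the sequence pair below is a placeholder and the 20-token limit mirrors the training setting:

    import torch
    from transformers import BertForSequenceClassification, BertTokenizer

    # Load the fine-tuned model and tokenizer from the training output directory.
    model = BertForSequenceClassification.from_pretrained("gens/")
    tokenizer = BertTokenizer.from_pretrained("gens/")
    model.eval()

    # Placeholder sequence pair -- replace with your own inputs.
    inputs = tokenizer.encode_plus(
        "first sequence", "second sequence",
        max_length=20, return_tensors="pt",
    )
    with torch.no_grad():
        # Outputs are a tuple in transformers 2.x; the logits come first.
        logits = model(inputs["input_ids"],
                       token_type_ids=inputs["token_type_ids"])[0]
    print("predicted label id:", torch.argmax(logits, dim=1).item())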
This project is licensed under the Apache License, Version 2.0; see the LICENSE.md file for details.
The code is based on the original implementations provided by Hugging Face Transformers.