# ztw_nlp
Source code for the natural language processing experiments in "Zero Time Waste: Recycling Predictions in Early Exit Neural Networks".

An extension of the Zero Time Waste model to NLP.
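The idea behind early-exit models such as ZTW can be sketched in a few lines: attach a classifier head after several intermediate layers and stop as soon as one of them is confident, reusing (recycling) the earlier heads' outputs rather than discarding them. The snippet below is an illustrative sketch only; the function names and the simple logit summation are assumptions for exposition, not the repo's code (the actual model uses learned cascading connections and a trained ensemble).

```python
import math


def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def early_exit(head_logits, threshold=0.9):
    """Run internal-classifier heads in order; stop at the first confident one.

    ZTW-style recycling (schematically): each head decides on the running
    sum of all earlier heads' logits, so earlier predictions are reused
    instead of thrown away.

    head_logits: per-head class logits, ordered from shallow to deep.
    Returns (predicted_class, number_of_heads_evaluated).
    """
    recycled = [0.0] * len(head_logits[0])
    for depth, logits in enumerate(head_logits, start=1):
        recycled = [r + l for r, l in zip(recycled, logits)]
        probs = softmax(recycled)
        confidence = max(probs)
        if confidence >= threshold:
            break  # confident enough: exit early
    return probs.index(confidence), depth
```

A confident shallow head ends inference immediately; an unconfident one still contributes its logits to every deeper head's decision.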

Setup

  1. Create and activate conda environment:
conda env create -f environment.yml
conda activate ztw_nlp
  1. Set up a wandb.ai account and create a project.
  2. Add your API keys by running wandb login as described here.
  3. Create .env file containing wandb entity and project names, following the example below:
WANDB_ENTITY=<entity>
WANDB_PROJECT=<project_name>
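How the repo reads these values is an implementation detail, but a `.env` file in this format can be loaded with a few lines of standard-library Python. This is only a sketch; the project may instead rely on a library such as python-dotenv.

```python
import os


def load_dotenv(path=".env"):
    """Minimal .env loader: copy KEY=VALUE lines into os.environ,
    skipping blank lines and comments, and never overwriting
    variables that are already set."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

After calling `load_dotenv()`, the values are available as `os.environ["WANDB_ENTITY"]` and `os.environ["WANDB_PROJECT"]`.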

## Experiments (baseline)

Finetune the base BERT models on the GLUE tasks:

```shell
./scripts/base_bert/finetune_all_models.sh
```

Train ZTW, SDN and PABEE models on a single task:

```shell
./scripts/ee/train_single_task.sh <task_name> <seed> <main_lr> <ensemble_lr>
```
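The two learning rates reflect ZTW's two training stages: the exit heads themselves, and a separately trained ensemble that combines them (hence the distinct `<ensemble_lr>`). Schematically, the ensemble stage combines the heads' probability vectors with a weighted geometric mean; the sketch below is a minimal stand-alone version, with function and argument names that are assumptions, not the repo's API.

```python
import math


def geometric_ensemble(head_probs, weights):
    """Combine per-head class probabilities with a weighted geometric
    mean, then renormalize.  In ZTW the per-head weights are learned,
    which is why training uses a separate ensemble learning rate.

    head_probs: one probability vector per early-exit head.
    weights: one non-negative weight per head.
    """
    n_classes = len(head_probs[0])
    combined = []
    for c in range(n_classes):
        log_p = sum(w * math.log(p[c]) for w, p in zip(weights, head_probs))
        combined.append(math.exp(log_p))
    z = sum(combined)
    return [x / z for x in combined]
```

A geometric (rather than arithmetic) combination lets a single very unconfident head veto a class, which is useful when shallow heads are unreliable.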

Reproduce the experiments from the paper:

```shell
./scripts/reproduce_main_experiments.sh
```

## Process the results from wandb into the format used in the paper

1. Generate the FLOPs mapping file:

   ```shell
   PYTHONPATH=$PYTHONPATH:. python scripts/generate_flops_mapping.py
   ```

2. Generate a CSV file with the results for a given task (plus the corresponding LaTeX code) from the wandb runs; for example, for an RTE model:

   ```shell
   PYTHONPATH=$PYTHONPATH:. python scripts/generate_final_table.py \
     --task RTE \
     --wandb_tag RTE_final \
     --flops_mapping_path results/flops_mapping.json \
     --output_path results/rte.csv
   ```
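The exact schema of `results/flops_mapping.json` is internal to the repo, but the usual role of such a mapping is to turn per-exit costs plus observed exit rates into an expected per-sample inference cost, which is what FLOPs-vs-accuracy tables report. A minimal sketch, with names and data layout assumed rather than taken from the repo:

```python
def expected_flops(flops_per_exit, exit_counts):
    """Expected per-sample inference cost of an early-exit model.

    flops_per_exit: cumulative FLOPs needed to reach each exit head
                    (e.g. read from a FLOPs-mapping file).
    exit_counts: how many evaluation samples exited at each head.
    """
    total = sum(exit_counts)
    return sum(f * c for f, c in zip(flops_per_exit, exit_counts)) / total
```

For instance, if half the samples exit at a head costing 1 GFLOP, the model's average cost can be far below the cost of always running the full network.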

## Experiments (EE-binary-BERT)

```shell
./scripts/bagging_ztw.sh TASK_NAME
```
