
BERT4GCN

Built with PyTorch Lightning and Hydra configs, based on the lightning-hydra-template.

Description

Official implementation of the EMNLP 2021 paper BERT4GCN: Using BERT Intermediate Layers to Augment GCN for Aspect-based Sentiment Classification.

How to run

Install dependencies

# clone project
git clone https://github.com/ZeguanXiao/BERT4GCN_lightning
cd BERT4GCN_lightning

# [OPTIONAL] create conda environment
conda create -n bert4gcn python=3.8
conda activate bert4gcn

# install pytorch according to instructions
# https://pytorch.org/get-started/

# install requirements
bash scripts/build_env.sh
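
The exact PyTorch install command depends on your OS, package manager, and CUDA version; the line below is only a sketch for a conda setup with CUDA 11.3 (use the link above to find the command matching your machine):

# example only: conda install for CUDA 11.3, adjust to your setup
conda install pytorch cudatoolkit=11.3 -c pytorch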

Download glove.840B.300d.zip and unzip it to glove/
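
A minimal sketch of this step, assuming glove/ sits at the repository root (the archive is hosted on the Stanford NLP site):

# download the pre-trained 300d GloVe vectors (~2 GB zip) and extract
wget https://nlp.stanford.edu/data/glove.840B.300d.zip
unzip glove.840B.300d.zip -d glove/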

Parse dependency graph

bash scripts/preprocess.sh

Train the model with a chosen experiment configuration from configs/experiment/

bash scripts/schedule.sh
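
Since the project follows the PyTorch Lightning + Hydra template layout, a single run can likely also be launched by selecting an experiment config on the command line; the entrypoint and placeholder config name below are assumptions, not verified file names:

# hypothetical single run: pick a config name from configs/experiment/
python train.py experiment=<experiment_name>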

The experiments reported in the paper can be reproduced with the commands above.

Citation

If you use this code in your research, please star this repo and cite our paper as follows:

@inproceedings{xiao-etal-2021-bert4gcn,
    title = "{BERT}4{GCN}: Using {BERT} Intermediate Layers to Augment {GCN} for Aspect-based Sentiment Classification",
    author = "Xiao, Zeguan  and
      Wu, Jiarun  and
      Chen, Qingliang  and
      Deng, Congjian",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.724",
    doi = "10.18653/v1/2021.emnlp-main.724",
    pages = "9193--9200",
    abstract = "Graph-based Aspect-based Sentiment Classification (ABSC) approaches have yielded state-of-the-art results, especially when equipped with contextual word embedding from pre-training language models (PLMs). However, they ignore sequential features of the context and have not yet made the best of PLMs. In this paper, we propose a novel model, BERT4GCN, which integrates the grammatical sequential features from the PLM of BERT, and the syntactic knowledge from dependency graphs. BERT4GCN utilizes outputs from intermediate layers of BERT and positional information between words to augment GCN (Graph Convolutional Network) to better encode the dependency graphs for the downstream classification. Experimental results demonstrate that the proposed BERT4GCN outperforms all state-of-the-art baselines, justifying that augmenting GCN with the grammatical features from intermediate layers of BERT can significantly empower ABSC models.",
}
