ON-LSTM

This repository contains the code used for the word-level language modeling and unsupervised parsing experiments in the paper Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks. It was originally forked from the LSTM and QRNN Language Model Toolkit for PyTorch. If you use this code or our results in your research, we'd appreciate it if you cite our paper as follows:

@article{shen2018ordered,
  title={Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks},
  author={Shen, Yikang and Tan, Shawn and Sordoni, Alessandro and Courville, Aaron},
  journal={arXiv preprint arXiv:1810.09536},
  year={2018}
}

Software Requirements

Python 3.6, NLTK and PyTorch 0.4 are required for the current codebase.
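A quick way to sanity-check the environment (a minimal sketch, not part of the repository; it only assumes the versions listed above):

import torch
import nltk

# PyTorch 0.4 is expected; newer versions may not be compatible with this codebase
print(torch.__version__)
print(nltk.__version__)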

Steps

  1. Install PyTorch 0.4 and NLTK

  2. Download the PTB data. Note that the two tasks, i.e., language modeling and unsupervised parsing, share the same model structure but require different formats of the PTB data: language modeling needs the standard 10,000-word Penn Treebank corpus, while parsing needs the Penn Treebank parsed data (see the data-loading sketch after these steps).

  3. Scripts and commands

    • Train the language model: python main.py --batch_size 20 --dropout 0.45 --dropouth 0.3 --dropouti 0.5 --wdrop 0.45 --chunk_size 10 --seed 141 --epoch 1000 --data /path/to/your/data

    • Test unsupervised parsing: python test_phrase_grammar.py --cuda

    The default settings in main.py achieve a perplexity of approximately 56.17 on the PTB test set and an unlabeled F1 (illustrated below) of approximately 47.7 on the WSJ test set.
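For step 2, the parsed PTB data follows the standard bracketed format. A minimal sketch of inspecting that format with NLTK's freely available 10% sample (the experiments themselves require the full licensed Penn Treebank):

import nltk
from nltk.corpus import treebank

nltk.download('treebank')  # small PTB sample; the paper uses the full corpus

tree = treebank.parsed_sents()[0]  # an nltk.Tree constituency parse
print(tree)           # bracketed parse, the format needed for parsing evaluation
print(tree.leaves())  # plain token sequence, as used for language modeling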
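The unlabeled F1 reported above compares predicted constituent spans against gold spans. A hedged illustration of the metric, not the repository's evaluation code (which lives in test_phrase_grammar.py):

def unlabeled_f1(pred_spans, gold_spans):
    # spans are (start, end) index pairs over a sentence
    pred, gold = set(pred_spans), set(gold_spans)
    overlap = len(pred & gold)
    precision = overlap / len(pred) if pred else 0.0
    recall = overlap / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(unlabeled_f1([(0, 2), (0, 4)], [(0, 2), (2, 4)]))  # 0.5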
