This repository contains the source code for all experiments presented in the paper "On the Long-Term Memory of Deep Recurrent Networks" (https://arxiv.org/abs/1710.09431).
Requirements:
- tensorflow
- numpy
- sklearn
- argparse
- The code in scoRNN.py is taken from https://github.com/SpartinStuff/scoRNN. We strongly recommend reading the accompanying paper, "Orthogonal Recurrent Neural Networks with Scaled Cayley Transform" (https://arxiv.org/abs/1707.09520).
Our results can be reproduced by running each experiment once per configuration, i.e. k runs for k configurations. Example command lines are given below, one configuration per experiment.
$cd copying_memory_task
$python run_copying_task.py -num_iters 50000 -rnn_depth 2 -rnn_hidden_dim 64 -B 10
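For readers unfamiliar with the copying memory task, the sketch below generates one batch of it in numpy. This is a minimal illustration following the common convention from the recurrent-memory literature, not the repository's own data pipeline; the function name, the fixed number of categories, and the marker placement are assumptions.

```python
import numpy as np

def copying_task_batch(batch_size, B, T=100, n_categories=8):
    """Generate one batch of the copying memory task (illustrative sketch).

    Input: B random category symbols, then T-1 blanks, then a 'go' marker,
    then B more blanks (total length T + 2B). Target: blanks everywhere
    except the last B positions, which must reproduce the initial symbols.
    The network therefore has to carry the B symbols across a delay of T.
    """
    blank = n_categories       # symbol index reserved for blanks
    marker = n_categories + 1  # symbol index reserved for the 'go' marker
    seq = np.random.randint(0, n_categories, size=(batch_size, B))
    x = np.full((batch_size, T + 2 * B), blank, dtype=np.int64)
    x[:, :B] = seq             # symbols to memorize
    x[:, B + T - 1] = marker   # cue to start reproducing them
    y = np.full_like(x, blank)
    y[:, -B:] = seq            # expected output after the marker
    return x, y
```

With `-B 10` as in the command above, each sequence carries 10 symbols across the delay.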
$cd start_end_similarity
$python run_SES.py -num_iters 50000 -rnn_depth 2 -rnn_hidden_dim 64 -T 80
$cd sequential_MNIST
$python run_seq_MNIST.py -permute 1 -num_iters 50000 -rnn_depth 2 -rnn_hidden_dim 64
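The `-permute 1` flag refers to the permuted variant of sequential MNIST, in which each 28x28 image is read as a sequence of 784 pixels whose order is shuffled by one fixed random permutation shared across all images. A minimal numpy sketch of that preprocessing step (function name and seed are illustrative, not taken from this repository):

```python
import numpy as np

def permuted_mnist_sequences(images, permute=True, seed=0):
    """Flatten (N, 28, 28) images into (N, 784) pixel sequences.

    With permute=True, one fixed random permutation (determined by `seed`
    and shared across all images) reorders the pixels, destroying local
    spatial structure and making the task harder for the recurrent net.
    """
    seqs = images.reshape(images.shape[0], -1)  # row-major pixel order
    if permute:
        rng = np.random.RandomState(seed)       # fixed permutation
        perm = rng.permutation(seqs.shape[1])
        seqs = seqs[:, perm]
    return seqs
```

With `-permute 0`, pixels are fed in plain row-major order instead.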