
Commit

Merge pull request #4 from kdgutier/pip
Pip
AzulGarza authored Apr 22, 2020
2 parents 268f24f + e0c0455 commit d5a88de
Showing 3 changed files with 31 additions and 9 deletions.
Binary file added .github/images/metrics.png
38 changes: 30 additions & 8 deletions README.md
[![Build](https://github.com/kdgutier/esrnn_torch/workflows/Python%20package/badge.svg?branch=pip)](https://github.com/kdgutier/esrnn_torch/tree/pip)
[![PyPI version fury.io](https://badge.fury.io/py/ESRNN.svg)](https://pypi.python.org/pypi/ESRNN/)
[![Downloads](https://pepy.tech/badge/esrnn)](https://pepy.tech/project/esrnn)
[![Python 3.6+](https://img.shields.io/badge/python-3.6+-blue.svg)](https://www.python.org/downloads/release/python-360+/)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://github.com/kdgutier/esrnn_torch/blob/master/LICENSE)


# Pytorch Implementation of the ES-RNN
In this project we coded a PyTorch class for the ES-RNN algorithm proposed by Smyl, the winning submission of the M4 Forecasting Competition. The class wraps fit and predict methods to facilitate interaction with machine learning pipelines, along with evaluation and data-wrangling utilities.


## Installation

This code is a work in progress; any contributions or issues are welcome on
GitHub at: https://github.com/kdgutier/esrnn_torch

You can install the *released version* of `ESRNN` from the [Python package index](https://pypi.org) with:

```console
pip install ESRNN
```

## Usage Example
Before fitting, make sure the dataframes are **balanced**, contain **no negative values**,
and have **no zeros** at the beginning of any series, since zeros and negative values
interact badly with the multiplicative model (a sanity-check sketch follows the example below).

```python
from ESRNN.m4_data import prepare_m4_data
from ESRNN.utils_evaluation import evaluate_prediction_owa

from ESRNN import ESRNN

X_train_df, y_train_df, X_test_df, y_test_df = prepare_m4_data(dataset_name='Yearly',
                                                               directory = './data',
                                                               num_obs=1000)

# Instantiate model
model = ESRNN(max_epochs=25, freq_of_test=5, batch_size=4, learning_rate=1e-4,
              per_series_lr_multip=0.8, lr_scheduler_step_size=10,
              lr_decay=0.1, gradient_clipping_threshold=50,
              rnn_weight_decay=0.0, level_variability_penalty=100,
              testing_percentile=50, training_percentile=50,
              ensemble=False, max_periods=25, seasonality=[],
              input_size=4, output_size=6,
              cell_type='LSTM', state_hsize=40,
              dilations=[[1], [6]], add_nl_layer=False,
              random_seed=1, device='cpu')

# Fit model
# If y_test_df is provided, the model will evaluate predictions on this set every freq_of_test epochs
model.fit(X_train_df, y_train_df, X_test_df, y_test_df)

# Predict on test set
y_hat_df = model.predict(X_test_df)

# Evaluate predictions
final_owa, final_mase, final_smape = evaluate_prediction_owa(y_hat_df, y_train_df,
                                                             X_test_df, y_test_df,
                                                             naive2_seasonality=1)
```
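
The requirements above (balanced panel, no leading zeros, no negative values) can be checked before calling `fit`. Below is a minimal sketch, assuming the long format produced by `prepare_m4_data` with `unique_id`, `ds` and `y` columns; the `validate_series` helper is illustrative and not part of the package, and a full balance check (no missing timestamps) is omitted because it depends on the sampling frequency.

```python
import pandas as pd

def validate_series(y_df: pd.DataFrame) -> None:
    # Illustrative pre-fit checks (not part of ESRNN); assumes one row per
    # observation with columns unique_id, ds and y, as in prepare_m4_data.
    assert (y_df['y'] >= 0).all(), 'negative values break the multiplicative model'
    assert not y_df.duplicated(['unique_id', 'ds']).any(), 'duplicated timestamps'
    for uid, serie in y_df.sort_values('ds').groupby('unique_id'):
        # no zeros at the beginning of any series
        assert serie['y'].iloc[0] != 0, f'series {uid} starts with a zero'

validate_series(y_train_df)
```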
## Overall Weighted Average

The Overall Weighted Average (OWA), proposed for the M4 competition, is a useful metric for quantifying the aggregate error of a model across many time series. It is calculated by averaging the symmetric mean absolute percentage error (sMAPE) and the mean absolute scaled error (MASE) over all of the model's time series and normalizing each by the corresponding value obtained for the Naive2 predictions. Both sMAPE and MASE are scale independent. These measures are calculated as follows:

![OWA](.github/images/metrics.png)
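
As a reference, here is a standalone NumPy sketch of these measures for a single series, following the M4 definitions. The function names and the `y_insample`/`y_naive2` arguments are illustrative and not part of the package, which computes OWA through `ESRNN.utils_evaluation.evaluate_prediction_owa`; note also that in the M4 convention sMAPE and MASE are averaged across all series before taking the ratios against Naive2.

```python
import numpy as np

def smape(y, y_hat):
    # symmetric mean absolute percentage error over the forecast horizon (in %)
    return 100 * np.mean(2 * np.abs(y - y_hat) / (np.abs(y) + np.abs(y_hat)))

def mase(y, y_hat, y_insample, seasonality=1):
    # absolute error scaled by the in-sample seasonal-naive mean absolute error
    scale = np.mean(np.abs(y_insample[seasonality:] - y_insample[:-seasonality]))
    return np.mean(np.abs(y - y_hat)) / scale

def owa(y, y_hat, y_naive2, y_insample, seasonality=1):
    # average of sMAPE and MASE, each normalized by the Naive2 benchmark
    return 0.5 * (smape(y, y_hat) / smape(y, y_naive2) +
                  mase(y, y_hat, y_insample, seasonality) /
                  mase(y, y_naive2, y_insample, seasonality))

# toy example with made-up numbers
y_insample = np.array([10., 12., 14., 13., 15., 16.])
y, y_hat, y_naive2 = np.array([17., 18.]), np.array([16.5, 18.5]), np.array([16., 16.])
print(owa(y, y_hat, y_naive2, y_insample))
```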



## Current Results
Here we use the model directly to compare against the original implementation. It is worth noting that these results do not include the ensemble methods mentioned in the [ESRNN paper](https://www.sciencedirect.com/science/article/pii/S0169207019301153).<br/>
Replicating the M4 results is as easy as running the following line of code (for each frequency) after installing the package via pip:

```console
python -m ESRNN.m4_run --dataset 'Yearly' --results_directory '/some/path' \
                       --gpu_id 0 --use_cpu 0
```

Use `--help` to get the description of each argument:
2 changes: 1 addition & 1 deletion setup.py

setuptools.setup(
name="ESRNN",
version="0.1.0",
version="0.1.1",
author="Kin Gutierrez, Cristian Challu, Federico Garza",
author_email="[email protected], [email protected], [email protected]",
description="Pytorch implementation of the ESRNN",
