
Deep Neural Networks (DNN) for time-series data

Turnkey and modular deep learning predictive modeling package for time-series data. It supports univariate and multivariate time series as well as single-step and multi-step forecasts. DNN models include RNNs, LSTMs, GRUs, CNNs, hybrids, and more.
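The single-step vs. multi-step distinction comes down to how the series is windowed into supervised (input, target) samples. A minimal sketch of that framing, not dnntime's internal API:

```python
import numpy as np

def make_windows(series, n_in, n_out):
    """Frame a 1-D series into (X, y) pairs for supervised learning.

    n_in:  length of each input window
    n_out: forecast horizon (1 = single-step, >1 = multi-step)
    """
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])          # past n_in observations
        y.append(series[i + n_in:i + n_in + n_out])  # next n_out targets
    return np.array(X), np.array(y)
```

For a multivariate series, the same idea applies with an extra feature axis on each window.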

Quick start

Step 1) Create and activate a new env using pipenv or conda with Python 3.6 or higher. Here, the env is named dts.

conda create -n dts python=3.6
conda activate dts
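Once the env is active, you can confirm from Python that the interpreter meets the 3.6 minimum:

```python
import sys

# dnntime requires Python 3.6+; fail fast if the env is older.
assert sys.version_info >= (3, 6), "Python 3.6 or higher is required"
print(sys.version)
```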

Step 2) Pip install the dnntime package. It will automatically install or update its dependencies.

pip install dnntime
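You can verify the install from Python with a generic presence check (this helper is illustrative, not part of dnntime itself):

```python
import importlib.util

def is_installed(pkg_name: str) -> bool:
    """Return True if a top-level package is importable in this env."""
    return importlib.util.find_spec(pkg_name) is not None
```

For example, `is_installed("dnntime")` should return True after the pip install succeeds.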

Step 3) In your working directory, download the example directory from this repo and cd into it.

svn export https://github.com/Kevin-Chen0/dnn-time-series.git/trunk/example
cd example

Step 4) To run it locally, open local_run.ipynb and run all cells. It uses local_config.yaml as parameters to customize the procedures at runtime. The notebook double-checks whether you have installed the latest dnntime (v0.3.9.3) and will install it for you if not. Make sure that you have set the dataset file path in local_config.yaml and the local_config.yaml path in local_run.ipynb.
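The notebook's version check can be sketched as follows. This is an assumed equivalent, not the notebook's exact code, and `importlib.metadata` needs Python 3.8+ (on 3.6/3.7, the `importlib_metadata` backport provides the same API):

```python
from importlib.metadata import PackageNotFoundError, version

REQUIRED = "0.3.9.3"  # the version this README pins

def needs_install(pkg: str, required: str) -> bool:
    """True if pkg is missing or not at the required version."""
    try:
        return version(pkg) != required
    except PackageNotFoundError:
        return True
```

If `needs_install("dnntime", REQUIRED)` returns True, the notebook would shell out to `pip install dnntime==0.3.9.3`.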

NOTE: It is highly recommended to run this package on a GPU. Although a CPU may work on small-scale datasets of < 10,000 samples, it may encounter performance issues on any dataset larger than that, including the example datasets found here. If you do not have a GPU, you can skip Step 4) and move to Step 5) Google Colab.
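A coarse way to check for an NVIDIA GPU before choosing between Step 4 and Step 5 is to look for the driver toolchain on PATH (deep learning frameworks have more precise checks, e.g. TensorFlow's `tf.config.list_physical_devices('GPU')`):

```python
import shutil

def nvidia_gpu_available() -> bool:
    """Rough proxy: is the NVIDIA driver utility nvidia-smi on PATH?"""
    return shutil.which("nvidia-smi") is not None
```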

Step 5) To run it on Google Colab, first make sure that you have a Google account and are logged in. Then go to the Colab page and upload colab_run.ipynb, colab_config.yaml, and a time-series example dataset. These files are stored in your Google Drive when you upload them to Colab. Next, copy the shareable links of colab_config.yaml and the dataset file and add them to colab_run.ipynb. Finally, in the colab_run.ipynb notebook, set the runtime type to GPU and run all cells.

NOTE: You can copy this notebook to run on any cloud notebook service, as long as you can customize how files are stored and retrieved on that cloud instance.
