From aae54531c4f16d1898a3098e6b7eefb60471381d Mon Sep 17 00:00:00 2001
From: Nathan Michlo <NathanJMichlo@gmail.com>
Date: Sat, 5 Jun 2021 00:13:46 +0200
Subject: [PATCH] version bump v0.0.1.dev14

Update README.md

updated .gitignore
---
 .gitignore |  1 +
 README.md  | 59 ++++++++++++++++++++++++++++++++++--------------------
 setup.py   |  2 +-
 3 files changed, 39 insertions(+), 23 deletions(-)

diff --git a/.gitignore b/.gitignore
index 7f3c124b..8e98a682 100644
--- a/.gitignore
+++ b/.gitignore
@@ -135,4 +135,5 @@ logs/
 
 # custom - root folder only
 /data/dataset
+/docs/examples/data
 *.pkl
diff --git a/README.md b/README.md
index 5f0120e6..f80844d6 100644
--- a/README.md
+++ b/README.md
@@ -39,6 +39,22 @@
 
 ----------------------
 
+### Table Of Contents
+
+- [Overview](#overview)
+- [Getting Started](#getting-started)
+- [Features](#features)
+  * [Frameworks](#frameworks)
+  * [Metrics](#metrics)
+  * [Datasets](#datasets)
+  * [Schedules & Annealing](#schedules--annealing)
+- [Examples](#examples)
+  * [Python Example](#python-example)
+  * [Hydra Config Example](#hydra-config-example)
+- [Why?](#why)
+
+----------------------
+
 ### Overview
 
 Disent is a modular disentangled representation learning framework for auto-encoders, built upon pytorch-lightning. This framework consists of various composable components that can be used to build and benchmark disentanglement pipelines.
@@ -159,7 +175,7 @@ add your own, or you have a request.
 
-#### Datasets:
+#### Datasets
 
 Various common datasets used in disentanglement research are implemented, as well as new synthetic datasets that are generated programmatically on the fly. These are convenient and lightweight, not requiring storage space.
 
@@ -181,7 +197,7 @@ Various common datasets used in disentanglement research are implemented, as wel
 - Input-based transforms are supported.
 - Input and target augmentations are supported on both the CPU and GPU.
 
-#### Schedules/Annealing:
+#### Schedules & Annealing
 
 Hyper-parameter annealing is supported through the use of schedules. The currently implemented schedules include:
 
@@ -192,21 +208,6 @@ Hyper-parameter annealing is supported through the use of schedules. The current
 
 ----------------------
 
-### Why?
-
-- Created as part of my Computer Science MSc, scheduled for completion in 2021.
-
-- I needed custom high-quality implementations of various VAEs.
-
-- A PyTorch version of [disentanglement_lib](https://github.com/google-research/disentanglement_lib).
-
-- I didn't have time to wait for [Weakly-Supervised Disentanglement Without Compromises](https://arxiv.org/abs/2002.02886) to release
-  their code as part of disentanglement_lib. (As of September 2020 it has been released, but has unresolved [discrepancies](https://github.com/google-research/disentanglement_lib/issues/31)).
-
-- disentanglement_lib still uses the outdated TensorFlow 1.x, and the flow of data is unintuitive because of its use of [Gin Config](https://github.com/google/gin-config).
-
-----------------------
-
 ### Architecture
 
 **disent**
@@ -225,7 +226,9 @@ Hyper-parameter annealing is supported through the use of schedules. The current
 
 ----------------------
 
-### Example Code
+### Examples
+
+#### Python Example
 
 The following is a basic working example of disent that trains a BetaVAE with a cyclic beta schedule and evaluates the trained model with various metrics.
 
@@ -299,10 +302,7 @@ print('metrics:', metrics)
 
 Visit the [docs](https://disent.dontpanic.sh) for more examples!
 
-
-----------------------
-
-### Hydra Experiment Example
+#### Hydra Config Example
 
 The entry point for basic experiments is `experiments/run.py`.
 
@@ -342,3 +342,18 @@ change the dataset from `xysquares` to `shapes3d`.
 `run_logging: wandb`. However, you will need to log in from the command line.
 
 ----------------------
+
+### Why?
+
+- Created as part of my Computer Science MSc, scheduled for completion in 2021.
+
+- I needed custom high-quality implementations of various VAEs.
+
+- A PyTorch version of [disentanglement_lib](https://github.com/google-research/disentanglement_lib).
+
+- I didn't have time to wait for [Weakly-Supervised Disentanglement Without Compromises](https://arxiv.org/abs/2002.02886) to release
+  their code as part of disentanglement_lib. (As of September 2020 it has been released, but has unresolved [discrepancies](https://github.com/google-research/disentanglement_lib/issues/31)).
+
+- disentanglement_lib still uses the outdated TensorFlow 1.x, and the flow of data is unintuitive because of its use of [Gin Config](https://github.com/google/gin-config).
+
+----------------------
diff --git a/setup.py b/setup.py
index 0dc38e5c..001bab23 100644
--- a/setup.py
+++ b/setup.py
@@ -48,7 +48,7 @@
     author="Nathan Juraj Michlo",
     author_email="NathanJMichlo@gmail.com",
 
-    version="0.0.1.dev13",
+    version="0.0.1.dev14",
 
     python_requires=">=3.8",
     packages=setuptools.find_packages(),
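For intuition, here is a small, framework-agnostic sketch of what the "cyclic beta schedule" mentioned in the Python Example section does: it maps the trainer's global step to a hyper-parameter value such as the BetaVAE's beta. The function names, the `period=1024`, and the 0 to 4 range below are illustrative assumptions, not disent's actual API and not part of the patch above.

```python
# Illustrative sketch only -- these helpers and numbers are assumptions,
# not disent's API and not part of the patch above.

def linear_schedule(step: int, total_steps: int, low: float = 0.0, high: float = 4.0) -> float:
    """Anneal a value linearly from `low` to `high` over `total_steps`, then hold."""
    t = min(max(step / total_steps, 0.0), 1.0)
    return low + t * (high - low)


def cyclic_schedule(step: int, period: int, low: float = 0.0, high: float = 4.0) -> float:
    """Repeat a linear ramp from `low` to `high` every `period` steps (cyclical annealing)."""
    t = (step % period) / period
    return low + t * (high - low)


if __name__ == "__main__":
    # beta ramps up, resets, and ramps up again as training progresses
    for step in range(0, 3072, 512):
        print(step, round(cyclic_schedule(step, period=1024), 3))
```

In disent itself the schedule is registered against a framework hyper-parameter (for example the `beta` of a BetaVAE) rather than computed by hand; see the Python example and the Schedules & Annealing list in the README for the supported schedule types.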