version bump v0.0.1.dev14
Update README.md
updated .gitignore
nmichlo committed Jun 4, 2021
1 parent e15fb4a commit aae5453
Showing 3 changed files with 39 additions and 23 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -135,4 +135,5 @@ logs/

 # custom - root folder only
 /data/dataset
+/docs/examples/data
 *.pkl
59 changes: 37 additions & 22 deletions README.md
@@ -39,6 +39,22 @@

----------------------

+### Table Of Contents
+
+- [Overview](#overview)
+- [Getting Started](#getting-started)
+- [Features](#features)
+  * [Frameworks](#frameworks)
+  * [Metrics](#metrics)
+  * [Datasets](#datasets)
+  * [Schedules & Annealing](#schedules--annealing)
+- [Examples](#examples)
+  * [Python Example](#python-example)
+  * [Hydra Config Example](#hydra-config-example)
+- [Why?](#why)
+
+----------------------
+
### Overview

Disent is a modular disentangled representation learning framework for auto-encoders, built upon pytorch-lightning. This framework consists of various composable components that can be used to build and benchmark disentanglement pipelines.
@@ -159,7 +175,7 @@ add your own, or you have a request.

</p></details>

-#### Datasets:
+#### Datasets

Various common datasets used in disentanglement research are implemented, as well as new synthetic datasets that are generated programmatically on the fly. These are convenient and lightweight, not requiring storage space.

Expand All @@ -181,7 +197,7 @@ Various common datasets used in disentanglement research are implemented, as wel
- Input-based transforms are supported.
- Input and target augmentations are supported on both the CPU and GPU.
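As a sketch of the on-the-fly generation idea (the class name and layout here are hypothetical, not disent's actual API), a programmatic ground-truth dataset can derive every observation directly from its factors:

```python
from itertools import product

class ToySquaresData:
    """Illustrative on-the-fly ground-truth dataset (hypothetical, not disent's API).

    Each observation is an image containing a single white square whose grid
    position (x, y) is the ground-truth factor pair, so every item is generated
    programmatically and nothing needs to be stored on disk.
    """

    def __init__(self, grid=4, square=2):
        self.grid = grid              # grid positions per axis
        self.square = square          # square side length in pixels
        self.size = grid * square     # image side length in pixels
        self.factors = list(product(range(grid), range(grid)))

    def __len__(self):
        return len(self.factors)

    def __getitem__(self, idx):
        x, y = self.factors[idx]
        img = [[0] * self.size for _ in range(self.size)]
        for r in range(y * self.square, (y + 1) * self.square):
            for c in range(x * self.square, (x + 1) * self.square):
                img[r][c] = 1
        return img, (x, y)

ds = ToySquaresData()
img, factors = ds[5]
print(len(ds), factors)  # → 16 (1, 1)
```

Because the factor space is enumerable, the dataset needs no download and its length is just the product of the factor sizes.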

-#### Schedules/Annealing:
+#### Schedules & Annealing

Hyper-parameter annealing is supported through the use of schedules. The currently implemented schedules include:

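As intuition for what such a schedule computes, here is a small sketch in plain Python (an illustrative cosine cycle, not disent's schedule API; the function and parameter names are hypothetical):

```python
import math

def cyclic_beta(step, period=1000, lo=0.0, hi=4.0):
    """Cosine-shaped cyclic annealing: beta climbs from `lo` to `hi`
    over each period of training steps, then resets and climbs again."""
    t = (step % period) / period  # position within the current cycle, in [0, 1)
    return lo + (hi - lo) * (1 - math.cos(math.pi * t)) / 2

# beta is 0 at the start of each cycle, ~2 halfway through, near 4 at the end
betas = [cyclic_beta(s) for s in (0, 500, 999, 1000)]
print(betas)
```

The same shape can anneal any scalar hyper-parameter; for a BetaVAE the scheduled value would simply be multiplied into the KL term of the loss.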
Expand All @@ -192,21 +208,6 @@ Hyper-parameter annealing is supported through the use of schedules. The current

----------------------

-### Why?
-
-- Created as part of my Computer Science MSc, scheduled for completion in 2021.
-
-- I needed custom high-quality implementations of various VAEs.
-
-- A PyTorch version of [disentanglement_lib](https://github.com/google-research/disentanglement_lib).
-
-- I didn't have time to wait for [Weakly-Supervised Disentanglement Without Compromises](https://arxiv.org/abs/2002.02886) to release
-  their code as part of disentanglement_lib. (As of September 2020 it has been released, but has unresolved [discrepancies](https://github.com/google-research/disentanglement_lib/issues/31)).
-
-- disentanglement_lib still uses outdated TensorFlow 1.0, and the flow of data is unintuitive because of its use of [Gin Config](https://github.com/google/gin-config).
-
-----------------------

### Architecture

**disent**
Expand All @@ -225,7 +226,9 @@ Hyper-parameter annealing is supported through the use of schedules. The current

----------------------

-### Example Code
+### Examples
+
+#### Python Example

The following is a basic working example of disent that trains a BetaVAE with a cyclic
beta schedule and evaluates the trained model with various metrics.
@@ -299,10 +302,7 @@ print('metrics:', metrics)

Visit the [docs](https://disent.dontpanic.sh) for more examples!
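The example code itself is collapsed above; as a toy illustration of the "evaluate with metrics" step it describes (a simplified correlation-based score, not one of disent's actual metrics — all names here are hypothetical):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def toy_disentanglement_score(latents, factors):
    """For each ground-truth factor, take the best |correlation| achieved by
    any single latent dimension, then average over factors (toy metric)."""
    n_factors, n_latents = len(factors[0]), len(latents[0])
    best = []
    for f in range(n_factors):
        fs = [row[f] for row in factors]
        best.append(max(abs(pearson([z[d] for z in latents], fs))
                        for d in range(n_latents)))
    return sum(best) / len(best)

# perfectly disentangled toy latents: dim 0 tracks factor x, dim 1 tracks factor y
factors = [(x, y) for x in range(4) for y in range(4)]
latents = [(2.0 * x, -1.0 * y) for x, y in factors]
print(round(toy_disentanglement_score(latents, factors), 6))  # → 1.0
```

Real metrics such as MIG or DCI follow the same overall shape: compare learned latent dimensions against known ground-truth factors and reward one-to-one alignment.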


-----------------------
-
-### Hydra Experiment Example
+#### Hydra Config Example

The entrypoint for basic experiments is `experiments/run.py`.

@@ -342,3 +342,18 @@ change the dataset from `xysquares` to `shapes3d`.
`run_logging: wandb`. However, you will need to log in from the command line.
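Going by the overrides mentioned above (treat the exact config group names as illustrative), invoking the entrypoint with Hydra-style `key=value` overrides might look like:

```shell
# swap the dataset from the default xysquares to shapes3d
python3 experiments/run.py dataset=shapes3d

# additionally enable wandb logging (requires a prior `wandb login`)
python3 experiments/run.py dataset=shapes3d run_logging=wandb
```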

----------------------

+### Why?
+
+- Created as part of my Computer Science MSc, scheduled for completion in 2021.
+
+- I needed custom high-quality implementations of various VAEs.
+
+- A PyTorch version of [disentanglement_lib](https://github.com/google-research/disentanglement_lib).
+
+- I didn't have time to wait for [Weakly-Supervised Disentanglement Without Compromises](https://arxiv.org/abs/2002.02886) to release
+  their code as part of disentanglement_lib. (As of September 2020 it has been released, but has unresolved [discrepancies](https://github.com/google-research/disentanglement_lib/issues/31)).
+
+- disentanglement_lib still uses outdated TensorFlow 1.0, and the flow of data is unintuitive because of its use of [Gin Config](https://github.com/google/gin-config).
+
+----------------------
2 changes: 1 addition & 1 deletion setup.py
@@ -48,7 +48,7 @@
author="Nathan Juraj Michlo",
author_email="[email protected]",

-version="0.0.1.dev13",
+version="0.0.1.dev14",
python_requires=">=3.8",
packages=setuptools.find_packages(),

