Reliable Categorical Variational Inference with Mixture of Discrete Normalizing Flows


tkusmierczyk/mixture_of_discrete_normalizing_flows



Quick start

The most basic but fully functional implementation of a mixture of discrete flows (assuming a factorized posterior) can be previewed in flows_factorized_mixture.py. This implementation is used by VAEFlowsBasic.ipynb, which demonstrates MDNF for a VAE (an amortized-inference example).

Publication

The code was used for, and is necessary to reproduce, the results from the paper:

T. Kuśmierczyk, A. Klami: Reliable Categorical Variational Inference with Mixture of Discrete Normalizing Flows (arXiv preprint)

Code organization

The code is modularized for maximum flexibility, which makes a variety of experiments possible at the cost of some efficiency: the construction of individual flows is separated from the construction of base distributions, and both are separated from the creation of mixtures and the inference algorithms. The exception is flows_factorized_mixture.py, which combines sampling from the base distribution, the location-shift transformation, and probability evaluation in a single file.

Description of files

  1. notebooks - Jupyter notebooks illustrating the use of MDNF with various models (the same code used later for the experiments)
  2. mdnf - main files implementing flows, mixtures, base distributions, inference, etc.

Specification of dependencies

Please start by installing the packages listed in requirements.txt (pip install -r requirements.txt). In case of problems, consult the following:

The code was tested with Python 3.7.4 (on a Linux platform), using tensorflow 2.2.0 and tensorflow_probability 0.9.0 (which can be installed with pip install tensorflow==2.2.0 tensorflow_probability==0.9.0). It also requires numpy, pandas, sklearn and scipy, which can be installed with pip install numpy pandas sklearn scipy but are also available by default in, for example, the Anaconda Python distribution. Potential problems with scipy 1.4.1 can be solved by downgrading it to version 1.2.1 with pip install scipy==1.2.1.

Notebooks (.ipynb) can be previewed using Jupyter Notebook and run from the command line with runipy. Visualizing the results requires matplotlib and seaborn to be available (pip install matplotlib seaborn).

Parts of the code for Bayesian networks require pgmpy (pip install pgmpy==0.1.10), and the code for Gaussian mixture models builds on Python code implementing algorithms described in Bishop's book, which can be installed with git clone https://github.com/ctgk/PRML; cd PRML; python setup.py install.

Finally, the code comparing partial and location-scale flows uses Edward2, which can be installed with pip install "git+https://github.com/google/edward2.git#egg=edward2".
