# DynamicHMCExamples.jl


This repository contains examples for the libraries related to Bayesian inference that I maintain. Install this with

```julia
pkg> add https://github.com/tpapp/DynamicHMCExamples.jl
```

which will download a working and tested set of versions. Optionally, you can run `pkg> up` after this.

The examples are in the `src/` directory as Julia source files, marked up for Literate.jl. This means that they can be executed directly, but they are also available as webpages.
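For illustration, here is how you could run or render one of the files; `src/example.jl` is a hypothetical placeholder name, substitute an actual file from `src/`:

```julia
# execute an example directly; "src/example.jl" is a hypothetical placeholder,
# substitute a file that actually exists in src/
include("src/example.jl")

# or render the same file to a markdown webpage with Literate.jl
using Literate
Literate.markdown("src/example.jl", "output")
```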

Note that this is not an introduction to Bayesian inference, merely an implementation in Julia using an approach that I find advantageous. The focus is on coding the (log) posterior as a function, then passing it to a modern Hamiltonian Monte Carlo sampler (a variant of NUTS, as described in Betancourt (2017)).
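As a minimal sketch of this workflow (not taken from the repository's examples), here is a toy log posterior for a standard normal, passed to DynamicHMC via the LogDensityProblems interface; the names below follow those packages' documented interfaces at the time of writing, so verify against the current documentation:

```julia
using DynamicHMC, LogDensityProblems, LogDensityProblemsAD, ForwardDiff, Random

# a toy problem: the log density of a standard normal, up to a constant
struct StandardNormalPosterior end

LogDensityProblems.logdensity(::StandardNormalPosterior, x) = -0.5 * sum(abs2, x)
LogDensityProblems.dimension(::StandardNormalPosterior) = 1
LogDensityProblems.capabilities(::Type{StandardNormalPosterior}) =
    LogDensityProblems.LogDensityOrder{0}()

# wrap with automatic differentiation so the sampler can obtain gradients
∇P = ADgradient(:ForwardDiff, StandardNormalPosterior())

# run the NUTS variant with warmup; the returned value contains the draws
# (in recent versions, as results.posterior_matrix)
results = mcmc_with_warmup(Random.default_rng(), ∇P, 1000)
```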

The advantage of this approach is that you can debug, benchmark, and optimize your posterior calculations directly using the tools in Julia, like any other Julia code (see the sketch after the list below). In contrast to other libraries,

  1. you don't need to use a DSL,
  2. you are not formulating your model as a directed acyclic graph,
  3. and you can calculate some or all derivatives manually.
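As a hedged illustration of the first point: since the wrapped posterior is an ordinary Julia object, the usual tooling applies to it directly (`∇P` is the gradient-wrapped log density from the earlier sketch; `@btime` is from BenchmarkTools):

```julia
using BenchmarkTools, LogDensityProblems

# evaluate and benchmark the log posterior like any other Julia function
x = zeros(LogDensityProblems.dimension(∇P))
LogDensityProblems.logdensity(∇P, x)            # debug: call it directly
@btime LogDensityProblems.logdensity($∇P, $x)   # benchmark it
```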

The implicit requirement of this approach is, of course, that you understand how to translate your model into a posterior function and code it in Julia.

The examples show how to do transformations and automatic differentiation with related libraries that wrap a log posterior function. However, if you prefer, you can use other approaches, such as manually coding the transformations or symbolic differentiation.
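For illustration, here is a hedged sketch (not from the repository) of the wrapped approach using TransformVariables.jl and TransformedLogDensities.jl; the toy log posterior below is an assumption for demonstration, not one of the examples:

```julia
using TransformVariables, TransformedLogDensities

# a toy log posterior on the natural parameter domain (μ ∈ ℝ, σ > 0);
# the density itself is a placeholder, not a real model
function logposterior(θ)
    (; μ, σ) = θ
    -0.5 * (μ^2 + (σ - 1)^2)
end

# map an unconstrained vector in ℝ² to a NamedTuple (μ ∈ ℝ, σ > 0)
t = as((μ = asℝ, σ = asℝ₊))

# ℓ supports the LogDensityProblems interface, so it can be wrapped with
# ADgradient and passed to mcmc_with_warmup as in the earlier sketch
ℓ = TransformedLogDensity(t, logposterior)
```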