From 239af2f1b488125e5c1fc7967691a744018f99ee Mon Sep 17 00:00:00 2001
From: Xiaoyan Wang
Date: Thu, 11 Jan 2024 16:18:57 -0500
Subject: [PATCH] Sync the changes from probcomp/gen-quickstart/pull/90

---
 _posts/2020-09-01-bounds.md                        |  2 +-
 ecosystem.md                                       |  2 +-
 tutorials/data-driven-proposals/tutorial.md        |  8 ++++----
 tutorials/intro-to-modeling/tutorial.md            |  2 +-
 tutorials/iterative-inference/tutorial.md          |  2 +-
 tutorials/particle-filtering/tutorial.md           | 12 ++++++------
 tutorials/regenerate/tutorial.md                   |  6 +++---
 tutorials/rj/tutorial.md                           |  2 +-
 tutorials/scaling-with-combinators-new/tutorial.md |  6 +++---
 9 files changed, 21 insertions(+), 21 deletions(-)

diff --git a/_posts/2020-09-01-bounds.md b/_posts/2020-09-01-bounds.md
index 920bf8f4a..5bca0f00c 100644
--- a/_posts/2020-09-01-bounds.md
+++ b/_posts/2020-09-01-bounds.md
@@ -82,7 +82,7 @@ This result seems to make sense qualitatively.
 ## Applying variational inference
 
 Let's now try to apply variational inference to this problem.
-We will use Gen's support for [black box variational inference](https://www.gen.dev/dev/ref/vi/#Black-box-variational-inference-1), which is a class of algorithms introduced by Rajesh Ranganath et al. in a [2013 paper](https://arxiv.org/abs/1401.0118) that requires only the ability to evaluate the unnormalized log probability density of the model.
+We will use Gen's support for [black box variational inference](https://www.gen.dev/docs/dev/ref/vi/#Black-box-variational-inference-1), which is a class of algorithms introduced by Rajesh Ranganath et al. in a [2013 paper](https://arxiv.org/abs/1401.0118) that requires only the ability to evaluate the unnormalized log probability density of the model.
 Gen lets you apply black box variational inference using variational approximating families that are themselves defined as probabilistic programs.
 
 The first step is to write the probabilistic program that defines the variational approximating family that we will optimize to match the posterior as closely as possible.
diff --git a/ecosystem.md b/ecosystem.md
index affe87b9e..2a8572aa7 100644
--- a/ecosystem.md
+++ b/ecosystem.md
@@ -54,4 +54,4 @@ Probability distributions and involutive MCMC kernels on orientations and rotati
 Wrapper for employing the [Redner](https://github.com/BachiLi/redner) differentiable renderer in Gen generative models.
 
 #### [GenTraceKernelDSL](https://github.com/probcomp/GenTraceKernelDSL.jl)
-An alternative interface to defining [trace translators](https://www.gen.dev/dev/ref/trace_translators/).
+An alternative interface to defining [trace translators](https://www.gen.dev/docs/dev/ref/trace_translators/).
diff --git a/tutorials/data-driven-proposals/tutorial.md b/tutorials/data-driven-proposals/tutorial.md
index ee5ab36fb..bd6db6222 100644
--- a/tutorials/data-driven-proposals/tutorial.md
+++ b/tutorials/data-driven-proposals/tutorial.md
@@ -513,7 +513,7 @@ Run inference using Gen's built-in importance resampling implementation. Use
 
 To see how to use the built-in importance resampling function, run ```?Gen.importance_resampling``` or check out the
-[documentation](https://www.gen.dev/dev/ref/importance/#Gen.importance_resampling).
+[documentation](https://www.gen.dev/docs/dev/ref/importance/#Gen.importance_resampling).
 
 We have provided some starter code.
@@ -780,7 +780,7 @@ num_y_bins = 5;
 ```
 
 We will propose the x-coordinate of the destination from a
-[piecewise_uniform](https://www.gen.dev/dev/ref/distributions/#Gen.piecewise_uniform)
+[piecewise_uniform](https://www.gen.dev/docs/dev/ref/distributions/#Gen.piecewise_uniform)
 distribution, where we set higher probability for certain bins based on the heuristic described above and use a uniform continuous distribution for the coordinate within a bin.
 
 The `compute_bin_probs` function below computes the
@@ -1075,7 +1075,7 @@ end;
 
 Our choice of the `score_high` value of 5. was somewhat arbitrary. To use more informed value, we can make `score_high` into a [*trainable
-parameter*](https://www.gen.dev/dev/ref/gfi/#Trainable-parameters-1)
+parameter*](https://www.gen.dev/docs/dev/ref/gfi/#Trainable-parameters-1)
 of the generative function. Below, we write a new version of the proposal function that makes `score_high` trainable. However, the optimization algorithms we will use for training work best with *unconstrained* parameters
@@ -1184,7 +1184,7 @@ end;
 
 Next, we choose type of optimization algorithm we will use for training. Gen supports a set of gradient-based optimization algorithms (see [Optimizing Trainable
-Parameters](https://www.gen.dev/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).
+Parameters](https://www.gen.dev/docs/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).
 Here we will use gradient descent with a fixed step size of 0.001.
diff --git a/tutorials/intro-to-modeling/tutorial.md b/tutorials/intro-to-modeling/tutorial.md
index 320e204b0..ef3b18212 100644
--- a/tutorials/intro-to-modeling/tutorial.md
+++ b/tutorials/intro-to-modeling/tutorial.md
@@ -771,7 +771,7 @@ Write an inference program that generates traces of `sine_model` that explain th
 
 What if we'd want to predict `ys` given `xs`? Using the API method
-[`generate`](https://www.gen.dev/dev/ref/gfi/#Gen.generate), we
+[`generate`](https://www.gen.dev/docs/dev/ref/gfi/#Gen.generate), we
 can generate a trace of a generative function in which the values of certain random choices are constrained to given values.
 The constraints are a choice map that maps the addresses of the constrained random choices to their
diff --git a/tutorials/iterative-inference/tutorial.md b/tutorials/iterative-inference/tutorial.md
index bd888f835..a451f235e 100644
--- a/tutorials/iterative-inference/tutorial.md
+++ b/tutorials/iterative-inference/tutorial.md
@@ -1097,7 +1097,7 @@ For example, let's say we wanted to take a trace and assign each point's
 `is_outlier` score to the most likely possibility. We can do this by iterating over both possible traces, scoring them, and choosing the one with the higher score. We can do this using Gen's
-[`update`](https://www.gen.dev/dev/ref/gfi/#Update-1) function,
+[`update`](https://www.gen.dev/docs/dev/ref/gfi/#Update-1) function,
 which allows us to manually update a trace to satisfy some constraints:
diff --git a/tutorials/particle-filtering/tutorial.md b/tutorials/particle-filtering/tutorial.md
index add743511..548fcea8a 100644
--- a/tutorials/particle-filtering/tutorial.md
+++ b/tutorials/particle-filtering/tutorial.md
@@ -45,12 +45,12 @@ We show how Gen's support for SMC integrates with its support for MCMC, enabling
 "bearings only tracking" problem described in [4].
 
 This notebook will also introduce you to the
-[`Unfold`](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1) combinator,
+[`Unfold`](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1) combinator,
 which can be used to improve performance of SMC. `Unfold` is just one example of the levers that Gen provides for improving performance; once you understand it, you can check Gen's documentation to see how similar principles apply to the
-[`Map`](https://www.gen.dev/dev/ref/combinators/#Map-combinator-1) combinator
+[`Map`](https://www.gen.dev/docs/dev/ref/combinators/#Map-combinator-1) combinator
 and to the static DSL.
 (These features are also covered in the previous tutorial, [Scaling with Combinators and the Static Modeling Language](../scaling-with-combinators-new/tutorial).)
@@ -238,7 +238,7 @@ sample of `num_samples` traces from the weighted collection that the particle
 filter produces.
 
 Gen provides methods for initializing and updating the state of a particle
-filter, documented in [Particle Filtering](https://www.gen.dev/dev/ref/pf/).
+filter, documented in [Particle Filtering](https://www.gen.dev/docs/dev/ref/pf/).
 
 - `Gen.initialize_particle_filter`
@@ -300,7 +300,7 @@ and then we introduce one additional bearing measurement by calling
 
 - The new arguments to the generative function for this step. In our case,
   this is the number of measurements beyond the first measurement.
-- The [argdiff](https://www.gen.dev/dev/ref/gfi/#Argdiffs-1)
+- The [argdiff](https://www.gen.dev/docs/dev/ref/gfi/#Argdiffs-1)
   value, which provides detailed information about the change to the arguments between the previous step and this step. We will revisit this value later. For now, we indicate that we do not know how the `T::Int`
@@ -645,7 +645,7 @@ body whenever performing a trace update. This allows the built-in modeling DSL
 to be very flexible and to have a simple implementation, at the cost of performance. There are several ways of improving performance after one has a prototype written in the built-in modeling DSL. One of these is [Generative
-Function Combinators](https://www.gen.dev/dev/ref/combinators/), which make
+Function Combinators](https://www.gen.dev/docs/dev/ref/combinators/), which make
 the flow of information through the generative process more explicit to Gen, and enable asymptotically more efficient inference programs.
@@ -676,7 +676,7 @@ Julia `for` loop in our model. This `for` loop has a very specific pattern of
 information flow—there is a sequence of states (represented by `x`, `y`, `vx`, and `vy`), and each state is generated from the previous state.
 This is exactly the pattern that the
-[Unfold](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1)
+[Unfold](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1)
 generative function combinator is designed to handle.
 
 Below, we re-express the Julia `for` loop over the state sequence using the
diff --git a/tutorials/regenerate/tutorial.md b/tutorials/regenerate/tutorial.md
index 079455854..708e0e996 100644
--- a/tutorials/regenerate/tutorial.md
+++ b/tutorials/regenerate/tutorial.md
@@ -5,7 +5,7 @@ layout: splash
 
 # Reasoning About Regenerate
 
-Gen provides a primitive called [`regenerate`](https://www.gen.dev/dev/ref/gfi/#Regenerate-1) that allows users to ask for certain random choices in a trace to be re-generated from scratch. `regenerate` is the basis of one variant of the [`metropolis_hastings`](https://www.gen.dev/dev/ref/mcmc/#Gen.metropolis_hastings) operator in Gen's inference library.
+Gen provides a primitive called [`regenerate`](https://www.gen.dev/docs/dev/ref/gfi/#Regenerate-1) that allows users to ask for certain random choices in a trace to be re-generated from scratch. `regenerate` is the basis of one variant of the [`metropolis_hastings`](https://www.gen.dev/docs/dev/ref/mcmc/#Gen.metropolis_hastings) operator in Gen's inference library.
 
 This notebook aims to help you understand the computation that `regenerate` is performing.
@@ -61,7 +61,7 @@ using Gen: regenerate, select, NoChange
 (trace, weight, retdiff) = regenerate(trace, (0.3,), (NoChange(),), select(:a));
 ```
 
-Note that unlike [`update`](https://www.gen.dev/dev/ref/gfi/#Gen.update), we do not provide the new values for the random choices that we want to change. Instead, we simply pass in a [selection](https://www.gen.dev/dev/ref/selections/#Selections-1) indicating the addresses that we want to propose new values for.
+Note that unlike [`update`](https://www.gen.dev/docs/dev/ref/gfi/#Gen.update), we do not provide the new values for the random choices that we want to change. Instead, we simply pass in a [selection](https://www.gen.dev/docs/dev/ref/selections/#Selections-1) indicating the addresses that we want to propose new values for.
 
 Note that `select(:a)` is equivalent to:
 ```julia
@@ -91,7 +91,7 @@ get_choices(trace)
 
 Re-run the regenerate command until you get a trace where `a` is `false`. Note that the address `b` doesn't appear in the resulting trace. Then, run the command again until you get a trace where `a` is `true`. Note that now there is a value for `b`. This value of `b` was sampled along with the new value for `a`---`regenerate` will regenerate new values for the selected adddresses, but also any new addresses that may be introduced as a consequence of stochastic control flow.
 
-What distribution is `regenerate` sampling the selected values from? It turns out that `regenerate` is using the [*internal proposal distribution family*](https://www.gen.dev/dev/ref/gfi/#Internal-proposal-distribution-family-1) $q(t; x, u)$, just like like `generate`. Recall that for `@gen` functions, the internal proposal distribution is based on *ancestral sampling*. But whereas `generate` was given the expicit choice map of constraints ($u$) as an argument, `regenerate` constructs $u$ by starting with the previous trace $t$ and then removing any selected addresses. In other words, `regenerate` is like `generate`, but where the constraints are the choices made in the previous trace less the selected choices.
+What distribution is `regenerate` sampling the selected values from? It turns out that `regenerate` is using the [*internal proposal distribution family*](https://www.gen.dev/docs/dev/ref/gfi/#Internal-proposal-distribution-family-1) $q(t; x, u)$, just like `generate`. Recall that for `@gen` functions, the internal proposal distribution is based on *ancestral sampling*. But whereas `generate` was given the explicit choice map of constraints ($u$) as an argument, `regenerate` constructs $u$ by starting with the previous trace $t$ and then removing any selected addresses. In other words, `regenerate` is like `generate`, but where the constraints are the choices made in the previous trace less the selected choices.
 
 We can make this concrete. Let us start with a deterministic trace again:
diff --git a/tutorials/rj/tutorial.md b/tutorials/rj/tutorial.md
index fe4b48d96..6e4f9975f 100644
--- a/tutorials/rj/tutorial.md
+++ b/tutorials/rj/tutorial.md
@@ -54,7 +54,7 @@ Given a dataset of `xs`, our model will randomly divide the range `(xmin, xmax)`
 It does this by sampling a number of segments (`:segment_count`), then sampling a vector of _proportions_ from a Dirichlet distribution (`:fractions`). The vector is guaranteed to sum to 1: if there are, say, three segments, this vector might be `[0.3, 0.5, 0.2]`. The length of each segment is the fraction of the interval assigned to it, times the length of the entire interval, e.g. `0.2 * (xmax - xmin)`. For each segmment, we generate a `y` value from a normal distribution. Finally, we sample the `y` values near the piecewise constant function described by the segments.
 
 ### Using `@dist` to define new distributions for convenience
-To sample the number of segments, we need a distribution with support only on the positive integers. We create one using the [Distributions DSL](https://www.gen.dev/dev/ref/distributions/#dist_dsl-1):
+To sample the number of segments, we need a distribution with support only on the positive integers. We create one using the [Distributions DSL](https://www.gen.dev/docs/dev/ref/distributions/#dist_dsl-1):
 
 ```julia
diff --git a/tutorials/scaling-with-combinators-new/tutorial.md b/tutorials/scaling-with-combinators-new/tutorial.md
index b84e60981..76fb25972 100644
--- a/tutorials/scaling-with-combinators-new/tutorial.md
+++ b/tutorials/scaling-with-combinators-new/tutorial.md
@@ -5,11 +5,11 @@ layout: splash
 
 # Scaling with Combinators and the Static Modeling Language
 
-Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:
+Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/docs/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:
 
-- A more specialized [Static Modeling Language](https://www.gen.dev/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.
+- A more specialized [Static Modeling Language](https://www.gen.dev/docs/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.
 
-- A class of modeling constructs called [Generative function combinators](https://www.gen.dev/dev/ref/combinators/).
+- A class of modeling constructs called [Generative function combinators](https://www.gen.dev/docs/dev/ref/combinators/).
 
 These features provide both constant-factor speedups, as well as improvements in asymptotic orders of growth, over the generic built-in modeling language.