Re-enable numerical asserts since CI now uses Quarto 1.6 #539

Merged · 4 commits · Oct 29, 2024
README.md (3 additions, 1 deletion)
@@ -8,8 +8,10 @@ This repository is part of [Turing.jl's](https://turinglang.org/) website (i.e.

To get started with the docs website locally, you'll need to have [Quarto](https://quarto.org/docs/download/) installed.
Make sure you have at least version 1.5 of Quarto installed, as this is required to correctly run [the native Julia engine](https://quarto.org/docs/computations/julia.html#using-the-julia-engine).
+Ideally, you should use Quarto 1.6.31 or later as this version fixes [a bug which causes random number generation between different cells to not be deterministic](https://github.com/TuringLang/docs/issues/533).
+Note that as of October 2024, Quarto 1.6 is a pre-release version, so you may need to install it from source rather than via a package manager like Homebrew.

-Once you have the prerequisite installed, you can follow these steps:
+Once you have Quarto installed, you can follow these steps:

1. Clone this repository:

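As a quick way to confirm that a local setup meets the version requirement described above, the following sketch checks the installed Quarto version from a Julia session. It is not part of this PR, and it assumes the `quarto` binary is on your PATH and prints a bare version string.

```julia
# Sketch: check the locally installed Quarto version (assumes `quarto` is on PATH
# and prints a plain version string such as "1.6.31").
quarto_version = VersionNumber(strip(readchomp(`quarto --version`)))
@assert quarto_version >= v"1.6.31" "Found Quarto $quarto_version; 1.6.31 or later is recommended (see issue #533)."
```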
tutorials/01-gaussian-mixture-model/index.qmd (3 additions, 6 deletions)
@@ -142,8 +142,7 @@ let
# μ[1] and μ[2] can switch places, so we sort the values first.
chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
μ_mean = vec(mean(chain; dims=1))
-# TODO: https://github.com/TuringLang/docs/issues/533
-# @assert isapprox(sort(μ_mean), μ; rtol=0.1) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+@assert isapprox(sort(μ_mean), μ; rtol=0.1) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
end
end
```
@@ -208,8 +207,7 @@ let
# μ[1] and μ[2] can no longer switch places. Check that they've found the mean
chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
μ_mean = vec(mean(chain; dims=1))
-# TODO: https://github.com/TuringLang/docs/issues/533
-# @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+@assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
end
end
```
@@ -349,8 +347,7 @@ let
# μ[1] and μ[2] can no longer switch places. Check that they've found the mean
chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
μ_mean = vec(mean(chain; dims=1))
-# TODO: https://github.com/TuringLang/docs/issues/533
-# @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+@assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
end
end
```
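The three re-enabled checks above compare the posterior mean of μ against the data-generating values within a relative tolerance; the estimates are sorted first so that label switching in the unidentified model does not trip the comparison. A minimal illustration of how the check behaves, using hypothetical values rather than the tutorial's actual ones:

```julia
# Hypothetical values, purely to illustrate the re-enabled check.
μ      = [-3.5, 0.5]      # sorted data-generating means (example values)
μ_mean = [0.48, -3.52]    # estimated means; components may come back in swapped order

# Sorting removes the label-switching ambiguity before comparing.
# For vectors, isapprox tests norm(x - y) <= rtol * max(norm(x), norm(y)).
@assert isapprox(sort(μ_mean), μ; rtol=0.1)
```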
tutorials/09-variational-inference/index.qmd (2 additions, 3 deletions)
@@ -155,9 +155,8 @@ var(x), mean(x)
#| echo: false
let
v, m = (mean(rand(q, 2000); dims=2)...,)
-# TODO: Fix these as they randomly fail https://github.com/TuringLang/docs/issues/533
-# @assert isapprox(v, 1.022; atol=0.1) "Mean of s (VI posterior, 1000 samples): $v"
-# @assert isapprox(m, -0.027; atol=0.03) "Mean of m (VI posterior, 1000 samples): $m"
+@assert isapprox(v, 1.022; atol=0.1) "Mean of s (VI posterior, 1000 samples): $v"
+@assert isapprox(m, -0.027; atol=0.03) "Mean of m (VI posterior, 1000 samples): $m"
end
```
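These two checks use absolute rather than relative tolerances. That choice matters because the target mean m is close to zero, and a relative tolerance on a near-zero target effectively demands near-exact equality. A small sketch with a hypothetical estimate:

```julia
# Hypothetical estimate, to show why an absolute tolerance is used near zero.
m_target = -0.027
m_est    = -0.010

isapprox(m_est, m_target; atol=0.03)  # true:  |m_est - m_target| = 0.017 <= 0.03
isapprox(m_est, m_target; rtol=0.1)   # false: 0.017 > 0.1 * max(|m_est|, |m_target|) = 0.0027
```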

tutorials/11-probabilistic-pca/index.qmd (5 additions, 8 deletions)
@@ -246,13 +246,10 @@ heatmap(
We can quantitatively check the absolute magnitudes of the column average of the gap between `mat_exp` and `mat_rec`:

```{julia}
-#| echo: false
-# let
-# diff_matrix = mat_exp .- mat_rec
-# @assert abs(mean(diff_matrix[:, 4])) <= 0.5 #0.327
-# @assert abs(mean(diff_matrix[:, 5])) <= 0.5 #0.390
-# @assert abs(mean(diff_matrix[:, 6])) <= 0.5 #0.326
-# end
+diff_matrix = mat_exp .- mat_rec
+for col in 4:6
+    @assert abs(mean(diff_matrix[:, col])) <= 0.5
+end
```

We observe that, using the posterior mean, the recovered data matrix `mat_rec` has values that align with the original data matrix; in particular, the same pattern in the first and last 3 gene features is captured, which implies the inference and p-PCA decomposition are successful.
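As an aside, the per-column loop added above could equivalently be written in vectorized form. A sketch with hypothetical stand-in matrices, since `mat_exp` and `mat_rec` are built earlier in the tutorial:

```julia
using Statistics

# Hypothetical stand-ins for mat_exp / mat_rec, only to make the sketch runnable.
mat_exp = randn(10, 6)
mat_rec = mat_exp .+ 0.05 .* randn(10, 6)  # "reconstruction" with small noise

# Vectorized equivalent of the per-column check in the diff above.
col_means = vec(mean(mat_exp .- mat_rec; dims=1))
@assert all(abs.(col_means[4:6]) .<= 0.5)
```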
@@ -383,4 +380,4 @@ It can also thought as a matrix factorisation method, in which $\mathbf{X}=(\mat
[^2]: Probabilistic PCA by TensorFlow, "https://www.tensorflow.org/probability/examples/Probabilistic_PCA".
[^3]: Gareth M. James, Daniela Witten, Trevor Hastie, Robert Tibshirani, *An Introduction to Statistical Learning*, Springer, 2013.
[^4]: David Wipf, Srikantan Nagarajan, *A New View of Automatic Relevance Determination*, NIPS 2007.
-[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.
+[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.