
RFC: Sampler-specific AD settings instead of global AD settings #1402

Closed
devmotion opened this issue Sep 6, 2020 · 1 comment
Comments

@devmotion
Member

Currently, AD settings in Turing are defined at a global level and (partly) propagated to other packages in this way. This requires us to dispatch on Turing-specific mutable global state and, e.g., doesn't allow performing parallel sampling (with a custom implementation) with different AD backends concurrently. The problem in #1400 and work on #1401 got me thinking: could we instead make the AD settings local state of the AD-compatible samplers (similar to, e.g., ODE algorithms in OrdinaryDiffEq)?
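To make the idea concrete, here is a minimal sketch of what sampler-local AD settings could look like. All names (`ADBackend`, `ForwardDiffAD`, `TrackerAD`, the `HMC` fields, `gradient_logp` taking a sampler) are illustrative assumptions, not Turing's actual API:

```julia
# Hypothetical sketch: the AD backend is a field of the sampler,
# not a global flag. Names here are illustrative only.
abstract type ADBackend end
struct ForwardDiffAD <: ADBackend end
struct TrackerAD <: ADBackend end

# The AD choice travels with the sampler, like algorithm options in
# OrdinaryDiffEq travel with the `alg` object.
struct HMC{AD<:ADBackend}
    epsilon::Float64   # leapfrog step size
    n_leapfrog::Int    # number of leapfrog steps
    adtype::AD         # sampler-local AD backend
end

# Gradient computation dispatches on the sampler-local backend, so two
# samplers using different backends can run concurrently in one session.
gradient_logp(spl::HMC, f, θ) = gradient_logp(spl.adtype, f, θ)
gradient_logp(::ForwardDiffAD, f, θ) = nothing  # ForwardDiff.jl-based pass here
gradient_logp(::TrackerAD, f, θ) = nothing      # Tracker.jl-based pass here
```

With this layout, `sample(model, HMC(0.1, 10, ForwardDiffAD()))` and `sample(model, HMC(0.1, 10, TrackerAD()))` could run side by side without touching any global switch.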

The main problem I see right now is that other packages such as Bijectors or AdvancedVI still use a global state, which would have to be changed as well. Perhaps Turing and these packages could share a common interface, defined in a separate package, for computing gradients etc. with all supported AD backends. Then, for instance, Turing would not have to define gradient_logp for every supported AD backend but could just call this interface's method for computing the forward and reverse passes in lines such as

Turing.jl/src/core/ad.jl

Lines 144 to 147 in e1ab7e0

# Compute forward and reverse passes.
l_tracked, ȳ = Tracker.forward(f, θ)
# Remove tracking info from variables in model (because mutable state).
l::T, ∂l∂θ::typeof(θ) = Tracker.data(l_tracked), Tracker.data(ȳ(1)[1])
.
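Such a shared interface might look like the following sketch. The function name `value_and_gradient` and its signature are hypothetical, and the body assumes Tracker.jl is available; the point is only that the backend-specific forward/reverse-pass code would live behind one common entry point:

```julia
# Hedged sketch of a shared gradient interface that Turing, Bijectors, and
# AdvancedVI could all call. `value_and_gradient` is a hypothetical name.
using Tracker  # assumed available; other backends would get their own method

function value_and_gradient(f, θ::AbstractVector)
    # Forward pass returns the tracked value and a pullback closure.
    l_tracked, ȳ = Tracker.forward(f, θ)
    # Strip tracking info (the model may contain mutable state); ȳ(1)
    # runs the reverse pass with seed 1 and returns the gradient w.r.t. θ.
    return Tracker.data(l_tracked), Tracker.data(ȳ(1)[1])
end
```

Each downstream package would then depend only on this interface package, and adding a new AD backend would mean adding one method rather than patching every consumer.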

@yebai
Member

yebai commented Nov 12, 2022

Likely fixed by #1877

@yebai yebai closed this as completed Nov 12, 2022

3 participants