
More friendly default sample interface #313

Closed · yebai opened this issue Jan 16, 2023 · 9 comments · Fixed by #325
yebai (Member) commented Jan 16, 2023

We should consider supporting the following AbstractMCMC.sample interface (defined here)

sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    logdensity::LogDensityProblem,
    sampler::AbstractSampler,
    N_or_isdone;
    kwargs...,
)

This allows users to pass a LogDensityProblem to the sample function instead of a Hamiltonian object. We could default to a Gaussian kinetic energy and the canonical momentum refreshment scheme. Moreover, we should define a few off-the-shelf sampler types rather than requiring the user to construct an HMC kernel from technically frightening components like leapfrog integrators and trajectory-termination types.
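
For context, constructing a sampler today looks roughly like this (a sketch adapted from the AdvancedHMC README; exact names and argument orders may have drifted):

using AdvancedHMC, ForwardDiff

# Assumed inputs: ℓπ implements the LogDensityProblems interface,
# initial_θ is a length-D parameter vector.
metric      = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
integrator  = Leapfrog(find_good_stepsize(hamiltonian, initial_θ))
kernel      = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
adaptor     = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))
samples, stats = sample(hamiltonian, kernel, initial_θ, 2_000, adaptor, 1_000)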

Such an intuitive sample interface can even be helpful for Turing models, allowing users to run MCMC without using out-of-date glue interfaces (correctly) criticised here.

yebai (Member, Author) commented Jan 16, 2023

@sethaxen any thoughts on how to design this API better?

yebai (Member, Author) commented Jan 16, 2023

cc @devmotion

yebai closed this as completed Jan 16, 2023
yebai reopened this Jan 16, 2023
torfjelde (Member) commented:
But that interface is supported, no? After #301?

yebai (Member, Author) commented Jan 16, 2023

I think we still need to explicitly construct the Hamiltonian type or LogDensityModel type and pass it to the sample function. Some cosmetic changes are needed in the AbstractMCMC interface implementation to support an intuitive interface like the one in TuringLang/AdvancedMH.jl#76.
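
In other words, the goal is something closer to this hypothetical one-liner, mirroring AdvancedMH (NUTS here is an illustrative off-the-shelf sampler type, not an existing AdvancedHMC constructor):

# Hypothetical target API: ℓπ is any LogDensityProblem;
# NUTS(0.8) stands in for an off-the-shelf sampler type.
chain = sample(ℓπ, NUTS(0.8), 1_000)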

torfjelde (Member) commented:
> I think we still need to explicitly construct the Hamiltonian type and pass it to the sample function

Nah, you can just construct the HMCSampler and pass that in :)
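
For example (a minimal sketch assuming the post-#301 interface, where HMCSampler bundles a kernel, metric, and adaptor; argument order and keyword names here are from memory and may differ):

using AdvancedHMC, AbstractMCMC

# kernel, metric, adaptor built as usual; ℓπ is a LogDensityProblem.
spl   = HMCSampler(kernel, metric, adaptor)
model = AbstractMCMC.LogDensityModel(ℓπ)
chain = sample(model, spl, 1_000; n_adapts = 500)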

sethaxen (Member) commented Jan 17, 2023

> I think we still need to explicitly construct the Hamiltonian type and pass it to the sample function
>
> Nah, you can just construct the HMCSampler and pass that in :)

@torfjelde can you give an example of what you mean?

In my view a user should be able to entirely configure the warm-up period and sampling period without requiring that they specify initializations for step size, metric, and initial point, while allowing for the possibility that they specialize the initializations.

It should also be really easy to configure the warm-up phase(s). For example, it should be possible to make initial step-size selection part of the warm-up phase, and to initialize the sampling with a metric of one type but then adapt a metric of a different type.

Maybe some of these are already possible. If so, awesome! It would be handy to document such cases. If it becomes possible to use Turing models directly with the sampler without needing to define a new sampler as Turing does, that would be amazing!
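
As a purely hypothetical strawman of what such configuration could look like (none of these keyword arguments exist):

# Strawman only — every keyword below is hypothetical.
chain = sample(
    model, spl, 1_000;
    init   = (; stepsize = nothing, metric = DiagEuclideanMetric(D)),  # nothing => find automatically
    warmup = (; find_stepsize = true, adapt_metric = DenseEuclideanMetric),
)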

yebai (Member, Author) commented Jan 17, 2023

> use Turing models directly with the sampler without needing to define a new sampler as Turing does, that would be amazing!

That's one nice goal for Turing (a by-product of our goal to thoroughly decouple model specification and inference). We are very close to that point!

yebai (Member, Author) commented Feb 6, 2023

I looked at DynamicHMC's interface, mcmc_with_warmup and mcmc_keep_warmup. What we need to add here is merely a similarly well-documented user-facing API.
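
For reference, DynamicHMC's entry point is roughly the following (a sketch based on its documentation; ℓ is assumed to implement the LogDensityProblems interface with gradient):

using DynamicHMC, Random

# One call performs warm-up and sampling; initialization, warm-up
# stages, etc. are configurable via keyword arguments with defaults.
results = mcmc_with_warmup(Random.default_rng(), ℓ, 1_000;
                           initialization = (; ϵ = 0.1))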

@devmotion, maybe we can make the AbstractMCMC.sample interface more intuitive so users have a nice way to specify adaptation, initialisation, algorithm choice etc. It seems Julia people love NamedTuples more than verbose arguments...

devmotion (Member) commented:
> maybe we can make the AbstractMCMC.sample interface more intuitive so users have a nice way to specify adaptation, initialisation, algorithm choice etc. It seems Julia people love NamedTuples more than verbose arguments...

AbstractMCMC does not restrict how samplers want their options to be specified; it only adds some standard options such as discard_initial, thinning, and progress (https://turinglang.org/AbstractMCMC.jl/dev/api/#Common-keyword-arguments). It's completely up to downstream packages whether they want to support these, whether they want to add more options (often the case, I assume), and whether they want to nest options as named tuples. So I think specifying/changing options in AdvancedHMC does not (necessarily) require any changes in AbstractMCMC.
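
For example, with a sampler that supports them (model and spl stand for any AbstractMCMC model/sampler pair):

using AbstractMCMC

# Standard options AbstractMCMC defines (downstream support is optional):
chain = sample(model, spl, 1_000;
               discard_initial = 500,   # drop the first 500 draws
               thinning        = 2,     # keep every 2nd draw
               progress        = false) # disable the progress meter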

Generally, I think there's not a single design choice that's used everywhere in the ecosystem. SciML (in most places) uses a large set of (non-nested) keyword arguments, DynamicHMC nests some options in NamedTuples (I guess this might be useful mostly in cases where you have some hierarchy of options), and Optim supports options as Optim.Options.

JaimeRZP mentioned this issue Mar 9, 2023
JaimeRZP self-assigned this Mar 9, 2023
JaimeRZP linked a pull request Mar 9, 2023 that will close this issue
This was referenced Jun 26, 2023