
Accept LogDensityProblems objects #122

Merged
merged 23 commits into main from logdensityproblems
Feb 1, 2023

Conversation

sethaxen
Member

@sethaxen sethaxen commented Jan 27, 2023

This PR removes the methods of pathfinder and multipathfinder that accepted both a function and its gradient and instead has a single method that takes either a function or an object implementing the LogDensityProblems interface.

It also moves away from using AbstractDifferentiation for AD backends (its development seems to have stalled) in favor of LogDensityProblemsAD. This should enable better compatibility with PPLs; Turing, Soss, and Tilde models all implement this interface.

Fixes #115
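To illustrate the new call pattern, here is a minimal sketch (not taken from the PR itself) of a toy target implementing the standard LogDensityProblems.jl interface and being passed directly to pathfinder; the `ndraws` keyword and automatic AD wrapping of a gradient-free problem are assumptions about Pathfinder's API, not something this PR text confirms:

```julia
using Pathfinder, LogDensityProblems

# Toy log density: a standard multivariate normal in `dim` dimensions,
# implementing the LogDensityProblems interface at order 0 (no gradient).
struct StdNormalProblem
    dim::Int
end

LogDensityProblems.logdensity(ℓ::StdNormalProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(ℓ::StdNormalProblem) = ℓ.dim
function LogDensityProblems.capabilities(::Type{StdNormalProblem})
    return LogDensityProblems.LogDensityOrder{0}()  # log density only
end

# After this PR, the problem object is passed directly:
result = pathfinder(StdNormalProblem(5); ndraws=100)
```

The same object would work with any other sampler that consumes the LogDensityProblems interface, which is the compatibility point the PR is making.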

@codecov-commenter

codecov-commenter commented Jan 27, 2023

Codecov Report

Base: 92.69% // Head: 92.64% // Decreases project coverage by 0.06% ⚠️

Coverage data is based on head (367fec7) compared to base (9e6bf45).
Patch coverage: 100.00% of modified lines in pull request are covered.


Additional details and impacted files
@@            Coverage Diff             @@
##             main     #122      +/-   ##
==========================================
- Coverage   92.69%   92.64%   -0.06%     
==========================================
  Files          13       13              
  Lines         575      571       -4     
==========================================
- Hits          533      529       -4     
  Misses         42       42              
Impacted Files Coverage Δ
src/Pathfinder.jl 100.00% <ø> (ø)
src/transducers.jl 100.00% <ø> (ø)
src/multipath.jl 67.27% <100.00%> (+0.60%) ⬆️
src/optimize.jl 93.33% <100.00%> (-0.79%) ⬇️
src/singlepath.jl 88.63% <100.00%> (+1.13%) ⬆️



@sethaxen
Member Author

For some reason, these changes break reproducibility when parallelizing with multithreading. Strange.

@sethaxen
Member Author

> For some reason, these changes break reproducibility when parallelizing with multithreading. Strange.

Seems to be due to tpapp/LogDensityProblemsAD.jl#6, relates JuliaDiff/ForwardDiff.jl#573

@sethaxen sethaxen marked this pull request as ready for review January 31, 2023 22:59
@sethaxen
Member Author

On further thought, I will follow AdvancedHMC and DynamicHMC and completely remove the option for users to pass a plain function. This simplifies the code. Pathfinder will probably mostly be used as a backend, and regardless, it is easy to construct LogDensityProblems objects from models in the major PPLs.
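For context on the AdvancedHMC/DynamicHMC convention mentioned above, here is a hedged sketch of how a gradient-free log density is typically equipped with an AD backend via LogDensityProblemsAD's `ADgradient` before being handed to a sampler; the toy problem and the choice of ForwardDiff are illustrative assumptions, not code from this PR:

```julia
using LogDensityProblems, LogDensityProblemsAD, ForwardDiff

# A toy order-0 problem: log density of a standard normal in 2 dimensions.
struct ToyProblem end
LogDensityProblems.logdensity(::ToyProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyProblem) = 2
LogDensityProblems.capabilities(::Type{ToyProblem}) =
    LogDensityProblems.LogDensityOrder{0}()

# ADgradient wraps the problem so gradients are computed with ForwardDiff,
# upgrading its capabilities from order 0 to order 1.
∇ℓ = ADgradient(:ForwardDiff, ToyProblem())
```

A wrapped object like `∇ℓ` is what gradient-based methods ultimately consume, which is why dropping the explicit function-plus-gradient methods loses no functionality.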

@sethaxen sethaxen merged commit b5d1917 into main Feb 1, 2023
@sethaxen sethaxen deleted the logdensityproblems branch February 1, 2023 20:54

Successfully merging this pull request may close these issues.

Supporting inputs implementing the LogDensityProblems interface
2 participants