Use `similar` in creation of DiffResults buffer #95
Conversation
src/DiffResults_helpers.jl

@@ -14,7 +14,7 @@ Allocate a DiffResults buffer for a gradient, taking the element type of `x` int
 function _diffresults_buffer(ℓ, x)
     T = eltype(x)
     S = T <: Real ? float(Real) : Float64 # heuristic
-    DiffResults.MutableDiffResult(zero(S), (Vector{S}(undef, dimension(ℓ)), ))
+    DiffResults.MutableDiffResult(zero(S), (similar(x, S), ))
Should this still take `dimension` into account? I'm not sure, but I assume there was a reason why it was used instead of `x`?
Suggested change:
- DiffResults.MutableDiffResult(zero(S), (similar(x, S), ))
+ DiffResults.MutableDiffResult(zero(S), (similar(x, S, dimension(ℓ)), ))
Currently all it uses from `x` is its element type, and it takes the dimension from the first argument. I think it could be done differently, using the dimension from `x`, and then maybe `ℓ` would not need to be an argument at all.
> using the dimension from x

So `similar` basically does this. And AFAIK this is only called from the `logdensity_and_gradient` method, meaning that `x` should always have the correct dimensions. Using `dimension` from `ℓ` instead will sometimes result in loss of structure of the original type for `x`, e.g. ComponentArrays.
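The structure-preservation point can be sketched as follows (a hypothetical example assuming ComponentArrays.jl; the vector `ca` is made up for illustration):

```julia
using ComponentArrays

# A hypothetical parameter vector with named blocks.
ca = ComponentVector(a = 1.0, b = [2.0, 3.0])

# similar(x, S) keeps the ComponentVector axes, so the gradient buffer
# carries the same named structure as x:
buf = similar(ca, Float64)          # ComponentVector with blocks a and b

# A plain Vector{S}(undef, dimension(ℓ)) would instead allocate a
# Vector{Float64}, dropping the a/b labels:
plain = Vector{Float64}(undef, length(ca))
```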
I would recommend something like

    function _diffresults_buffer(x)
        T = eltype(x)
        S = T <: Real ? float(Real) : Float64 # heuristic
        DiffResults.MutableDiffResult(zero(S), (similar(x, S), ))
    end
with the caller modified accordingly.
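If `ℓ` is dropped, the call site would presumably change along these lines (a sketch only; the actual caller in the package may look different):

```julia
# Before: the dimension of the gradient buffer came from ℓ.
buffer = _diffresults_buffer(ℓ, x)

# After: both eltype and shape come from x via similar,
# so ℓ no longer needs to be passed.
buffer = _diffresults_buffer(x)
```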
This is what I've implemented, right? Or are my eyes deceiving me?
I think @tpapp's point is that the first argument of the function should be dropped if we do not use it anymore.
src/DiffResults_helpers.jl (outdated)

@@ -25,5 +25,6 @@ constructed with [`diffresults_buffer`](@ref). Gradient is not copied as caller
 vector.
 """
 function _diffresults_extract(diffresult::DiffResults.DiffResult)
+    # NOTE: Is this still needed?
That's not affected by the change above, is it?
I also don't understand why it would not be needed.
Nah, it was a note by me so I could ask why it's there :) As in, can we just remove it?
What would you use instead?
The purpose was to collect everything related to extraction in a single function. I am open to suggestions, but simply removing it would break the package (the function is used).
Ah, I thought the `::Real` was the reason for it, and was thinking maybe this is now redundant because type inference is better / DiffResults might have fixed whatever issue required the `::Real` in the first place. If that's not the case and it's just for convenience, then I'll remove the comment :)
I no longer recall the case for `::Real`, so I guess you can remove it. We can always put it back if things break, and then document why it is there. Please also remove the comment, and then I am happy to merge.
Done:)
Aaah of course. I'll get that sorted.
LGTM, just remove `::Real` and the added comment.
LGTM, thanks!
Using `similar` means that we get automatic compat with several other packages, e.g. ComponentArrays.jl (the demo from the docs). Ref: TuringLang/AdvancedHMC.jl#301
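A minimal sketch of that compatibility (hypothetical input vector; assumes the DiffResults.jl and ComponentArrays.jl APIs as used in this PR, not the exact docs demo):

```julia
using ComponentArrays, DiffResults

x = ComponentVector(μ = 0.0, σ = [1.0, 2.0])
S = Float64

# Buffer built as in this PR: the gradient slot is itself a ComponentVector.
result = DiffResults.MutableDiffResult(zero(S), (similar(x, S),))

# The gradient read back out keeps the named blocks of x,
# e.g. it can be indexed as g.μ and g.σ:
g = DiffResults.gradient(result)
```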