Merge pull request #89 from tpapp/tp/remove-transformvariables
Remove TransformVariables.
tpapp authored Aug 30, 2022
2 parents d3ea261 + a4fb30f commit 428ea4c
Showing 9 changed files with 93 additions and 161 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/CI.yml
@@ -17,7 +17,7 @@ jobs:
fail-fast: false
matrix:
version:
- '1.0' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
- '1.6' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
- '1' # Leave this line unchanged. '1' will automatically expand to the latest stable 1.x release of Julia.
# - 'nightly' # NOTE: nightly disabled as it currently fails
os:
6 changes: 2 additions & 4 deletions Project.toml
@@ -1,15 +1,14 @@
name = "LogDensityProblems"
uuid = "6fdf6af0-433a-55f7-b3ed-c6c6e0b8df7c"
authors = ["Tamas K. Papp <[email protected]>"]
version = "0.12.0"
version = "1.0.0"

[deps]
ArgCheck = "dce04be8-c92d-5529-be00-80e4d2c0e197"
DiffResults = "163ba53b-c6d8-5494-b064-1a9d43ac40c5"
DocStringExtensions = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Requires = "ae029012-a4dd-5104-9daa-d747884805df"
TransformVariables = "84d833dd-6860-57f9-a1a7-6da5db126cff"
UnPack = "3a884ed6-31ef-47d7-9d2a-63182c4928ed"

[compat]
@@ -18,9 +17,8 @@ BenchmarkTools = "1"
DiffResults = "0.0, 1"
DocStringExtensions = "0.8, 0.9"
Requires = "0.5, 1"
TransformVariables = "0.2, 0.3, 0.4, 0.5, 0.6"
UnPack = "0.1, 1"
julia = "1"
julia = "1.6"

[extras]
BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
8 changes: 6 additions & 2 deletions README.md
@@ -13,8 +13,12 @@ A common framework for implementing and using log densities for inference, provi

2. The [`ADgradient`](https://tamaspapp.eu/LogDensityProblems.jl/dev/#LogDensityProblems.ADgradient) wrapper, which takes objects that support `logdensity` for log density *values* and makes them also calculate log density *gradients*, using various automatic differentiation packages.

3. The wrapper [`TransformedLogDensity`](https://tamaspapp.eu/LogDensityProblems.jl/dev/#LogDensityProblems.TransformedLogDensity) using the [TransformVariables.jl](https://github.com/tpapp/TransformVariables.jl) package, allowing callables that take a set of parameters transformed from a flat vector of real numbers to support the `logdensity` interface.
3. Various utility functions for debugging and testing log densities.

4. Various utility functions for debugging and testing log densities.
**NOTE** As of version 1.0, transformed log densities have been moved to [TransformedLogDensities.jl](https://github.com/tpapp/TransformedLogDensities.jl). Existing code that uses `TransformedLogDensity` should add
```
using TransformedLogDensities
```
or equivalent.

See the [documentation](https://tpapp.github.io/LogDensityProblems.jl/dev) for details.
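For context, here is a minimal migration sketch based on the README note above. The toy `problem` callable and the TransformVariables transformation are illustrative placeholders, not part of this commit; the only point is that the constructor is now loaded from TransformedLogDensities instead of being exported by LogDensityProblems.

```julia
# Before v1.0, TransformedLogDensity was exported by LogDensityProblems itself.
# From v1.0 on, add TransformedLogDensities to the project and load it explicitly.
using LogDensityProblems, TransformVariables, TransformedLogDensities

# toy log density callable taking a NamedTuple (μ, σ); stands in for a real posterior
problem(θ) = -abs2(θ.μ) / (2 * abs2(θ.σ)) - log(θ.σ)

t = as((μ = asℝ, σ = asℝ₊))            # map a length-2 vector to (μ ∈ ℝ, σ > 0)
ℓ = TransformedLogDensity(t, problem)   # constructor now comes from TransformedLogDensities

LogDensityProblems.dimension(ℓ)              # 2
LogDensityProblems.logdensity(ℓ, zeros(2))   # log density at μ = 0, σ = 1 (plus log Jacobian)
```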
1 change: 1 addition & 0 deletions docs/Project.toml
@@ -4,6 +4,7 @@ Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c"
TransformVariables = "84d833dd-6860-57f9-a1a7-6da5db126cff"
TransformedLogDensities = "f9bc47f6-f3f8-4f3b-ab21-f8bc73906f26"
UnPack = "3a884ed6-31ef-47d7-9d2a-63182c4928ed"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

2 changes: 1 addition & 1 deletion docs/make.jl
@@ -1,4 +1,4 @@
using Documenter, LogDensityProblems, ForwardDiff, Tracker, Zygote, BenchmarkTools
using Documenter, LogDensityProblems, ForwardDiff, Tracker, Zygote, BenchmarkTools, TransformedLogDensities

makedocs(
sitename = "LogDensityProblems.jl",
11 changes: 5 additions & 6 deletions docs/src/index.md
@@ -87,8 +87,11 @@ problem((μ = 0.0, σ = 1.0))

In our example we require ``\sigma > 0``; otherwise the problem is meaningless. However, many MCMC samplers prefer to operate on *unconstrained* spaces ``\mathbb{R}^n``. The TransformVariables package was written to transform unconstrained to constrained spaces and to help with the log Jacobian correction (more on that later). That package has detailed documentation; here we just define a transformation from a length-2 vector to a `NamedTuple` with fields `μ` (unconstrained) and `σ > 0`.

!!! note
Since version 1.0, TransformedLogDensity has been moved to the package TransformedLogDensities.

```@repl 1
using LogDensityProblems, TransformVariables
using LogDensityProblems, TransformVariables, TransformedLogDensities
ℓ = TransformedLogDensity(as((μ = asℝ, σ = asℝ₊)), problem)
```

@@ -101,13 +104,9 @@ LogDensityProblems.logdensity(ℓ, zeros(2))
!!! note
Before running time-consuming algorithms like MCMC, it is advisable to test and benchmark your log density evaluations separately. The same applies to [`LogDensityProblems.logdensity_and_gradient`](@ref).

```@docs
TransformedLogDensity
```

## Manual unpacking and transformation

If you prefer to implement the transformation yourself, you just have to define the following three methods for your problem: declare that it can evaluate log densities (but not their gradient, hence the `0` order), allow the dimension of the problem to be queried, and then finally code the density calculation with the transformation. (Note that using [`TransformedLogDensity`](@ref) takes care of all of these for you, as shown above).
If you prefer to implement the transformation yourself, you just have to define the following three methods for your problem: declare that it can evaluate log densities (but not their gradient, hence the `0` order), allow the dimension of the problem to be queried, and then finally code the density calculation with the transformation. (Note that using `TransformedLogDensities.TransformedLogDensity` takes care of all of these for you, as shown above).

```@example 1
function LogDensityProblems.capabilities(::Type{<:NormalPosterior})
# (rest of this example not shown in the diff)
```
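The `docs/src/index.md` paragraph above describes the manual route: define `capabilities`, `dimension`, and `logdensity` for your own type. Below is a minimal sketch of that three-method interface for a hypothetical `ToyProblem`; it is not taken from this commit, and the density and `exp` transformation are made up for illustration.

```julia
using LogDensityProblems

struct ToyProblem end  # hypothetical; a real problem would carry its data in fields

# order 0: log density values only, no gradient
LogDensityProblems.capabilities(::Type{ToyProblem}) = LogDensityProblems.LogDensityOrder{0}()

LogDensityProblems.dimension(::ToyProblem) = 2

function LogDensityProblems.logdensity(::ToyProblem, x)
    μ, logσ = x                # unconstrained parametrization: x = [μ, log(σ)]
    σ = exp(logσ)
    # toy log posterior in (μ, σ), plus the log Jacobian of σ = exp(logσ), which is logσ
    -abs2(μ) / (2 * abs2(σ)) - log(σ) + logσ
end

LogDensityProblems.logdensity(ToyProblem(), zeros(2))  # 0.0
```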
85 changes: 5 additions & 80 deletions src/LogDensityProblems.jl
@@ -12,14 +12,13 @@ documentation.
"""
module LogDensityProblems

export TransformedLogDensity, ADgradient
export ADgradient

using ArgCheck: @argcheck
using DocStringExtensions: SIGNATURES, TYPEDEF
using UnPack: @unpack
import Random
using Random: AbstractRNG, default_rng
using Requires: @require
import TransformVariables
using UnPack: @unpack

####
#### interface for problems
@@ -128,50 +127,6 @@ The first argument (the log density) can be shifted by a constant, see the note
"""
function logdensity_and_gradient end

#####
##### Transformed log density (typically Bayesian inference)
#####

"""
TransformedLogDensity(transformation, log_density_function)
A problem in Bayesian inference. Vectors of length compatible with the dimension (obtained
from `transformation`) are transformed into a general object `θ` (unrestricted type, but a
named tuple is recommended for clean code), correcting for the log Jacobian determinant of
the transformation.
`log_density_function(θ)` is expected to return *real numbers*. For zero densities or
infeasible `θ`s, `-Inf` or similar should be returned, but for efficiency of inference most
methods recommend using `transformation` to avoid this. It is recommended that
`log_density_function` is a callable object that also encapsulates the data for the problem.
Use the property accessors `ℓ.transformation` and `ℓ.log_density_function` to access the
arguments of `ℓ::TransformedLogDensity`, these are part of the public API.
# Usage note
This is the most convenient way to define log densities, as `capabilities`, `logdensity`,
and `dimension` are automatically defined. To obtain a type that supports derivatives, use
[`ADgradient`](@ref).
"""
struct TransformedLogDensity{T <: TransformVariables.AbstractTransform, L}
transformation::T
log_density_function::L
end

function Base.show(io::IO, ℓ::TransformedLogDensity)
print(io, "TransformedLogDensity of dimension $(dimension(ℓ))")
end

capabilities(::Type{<:TransformedLogDensity}) = LogDensityOrder{0}()

dimension(p::TransformedLogDensity) = TransformVariables.dimension(p.transformation)

function logdensity(p::TransformedLogDensity, x::AbstractVector)
@unpack transformation, log_density_function = p
TransformVariables.transform_logdensity(transformation, log_density_function, x)
end

#####
##### AD wrappers --- interface and generic code
#####
@@ -244,38 +199,8 @@ function __init__()
@require Enzyme="7da242da-08ed-463a-9acd-ee780be4f1d9" include("AD_Enzyme.jl")
end

####
#### stress testing
####

"""
$(SIGNATURES)
Test `ℓ` with random values.
`N` random vectors are drawn from a standard multivariate Cauchy distribution, scaled with
`scale` (which can be a scalar or a conformable vector).
Each random vector is then used as an argument in `f(ℓ, ...)`. `logdensity` and
`logdensity_and_gradient` are recommended for `f`.
In case the call produces an error, the value is recorded as a failure, which are returned
by the function.

Not exported, but part of the API.
"""
function stresstest(f, ℓ; N = 1000, rng::Random.AbstractRNG = Random.GLOBAL_RNG, scale = 1)
failures = Vector{Float64}[]
d = dimension(ℓ)
for _ in 1:N
x = TransformVariables.random_reals(d; scale = scale, cauchy = true, rng = rng)
try
f(ℓ, x)
catch e
push!(failures, x)
end
end
failures
end
include("utilities.jl")

Base.@deprecate_moved TransformedLogDensity "TransformedLogDensities"

end # module
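For readers unfamiliar with `Base.@deprecate_moved` (an internal Base macro): it defines a stub under the old name that errors with a pointer to the package the binding moved to. The following is an illustrative, hand-written approximation of that kind of stub, not the macro's actual expansion.

```julia
# Illustrative approximation only; the real definition comes from Base.@deprecate_moved.
function TransformedLogDensity(args...; kwargs...)
    error("TransformedLogDensity has been moved to the package TransformedLogDensities.jl.\n",
          "Install it with `import Pkg; Pkg.add(\"TransformedLogDensities\")` and add ",
          "`using TransformedLogDensities` to your code.")
end
```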
60 changes: 60 additions & 0 deletions src/utilities.jl
@@ -0,0 +1,60 @@
#####
##### utilities
#####

####
#### random reals
####

function _random_reals_scale(rng::AbstractRNG, scale::Real, cauchy::Bool)
cauchy ? scale / abs2(randn(rng)) : scale * 1.0
end

"""
$(SIGNATURES)
Random vector in ``ℝⁿ`` of length `n`.
A standard multivariate normal or Cauchy is used, depending on `cauchy`, then scaled with
`scale`. `rng` is the random number generator used.
Not exported, but part of the API.
"""
function random_reals(n::Integer; scale::Real = 1, cauchy::Bool = false,
rng::AbstractRNG = default_rng())
randn(rng, n) .* _random_reals_scale(rng, scale, cauchy)
end

####
#### stress testing
####

"""
$(SIGNATURES)
Test `ℓ` with random values.
`N` random vectors are drawn from a standard multivariate Cauchy distribution, scaled with
`scale` (which can be a scalar or a conformable vector).
Each random vector is then used as an argument in `f(ℓ, ...)`. `logdensity` and
`logdensity_and_gradient` are recommended for `f`.
If a call produces an error, the input is recorded as a failure; all failures are returned
by the function.
Not exported, but part of the API.
"""
function stresstest(f, ℓ; N = 1000, rng::AbstractRNG = default_rng(), scale = 1)
failures = Vector{Float64}[]
d = dimension(ℓ)
for _ in 1:N
x = random_reals(d; scale = scale, cauchy = true, rng = rng)
try
f(ℓ, x)
catch e
push!(failures, x)
end
end
failures
end
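A hedged usage sketch for the `stresstest` utility added above: `FragileProblem` is a made-up log density that errors in part of its domain, and the sketch only assumes the `stresstest(f, ℓ; N, rng, scale)` signature shown in the diff.

```julia
using LogDensityProblems

struct FragileProblem end  # hypothetical problem type used only for this illustration

LogDensityProblems.capabilities(::Type{FragileProblem}) = LogDensityProblems.LogDensityOrder{0}()
LogDensityProblems.dimension(::FragileProblem) = 3

function LogDensityProblems.logdensity(::FragileProblem, x)
    sum(x) > 10 && error("deliberately broken region")  # simulate a bug in the density
    -sum(abs2, x) / 2
end

# draw 500 Cauchy-distributed inputs (scaled by 5) and collect the ones that error
failures = LogDensityProblems.stresstest(LogDensityProblems.logdensity, FragileProblem();
                                         N = 500, scale = 5)
length(failures)  # how many inputs triggered an error; each failure is the offending vector
```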
79 changes: 12 additions & 67 deletions test/runtests.jl
@@ -9,11 +9,11 @@
struct EnzymeTestMode <: Enzyme.Mode end
end

using LogDensityProblems, Test, Distributions, TransformVariables, BenchmarkTools
using LogDensityProblems, Test, Distributions, BenchmarkTools
import LogDensityProblems: capabilities, dimension, logdensity
using LogDensityProblems: logdensity_and_gradient, LogDensityOrder

import ForwardDiff, Tracker, TransformVariables, Random, Zygote, ReverseDiff
import ForwardDiff, Tracker, Random, Zygote, ReverseDiff
using UnPack: @unpack

####
@@ -65,7 +65,7 @@ end
end

###
### simple log density for testing
### simple log densities for testing
###

struct TestLogDensity{F}
@@ -79,6 +79,10 @@ test_gradient(x) = x .* [-4, -6, -10]
TestLogDensity() = TestLogDensity(test_logdensity) # default: -Inf for negative input
Base.show(io::IO, ::TestLogDensity) = print(io, "TestLogDensity")

struct TestLogDensity2 end
logdensity(::TestLogDensity2, x) = -sum(abs2, x)
dimension(::TestLogDensity2) = 20

####
#### traits
####
@@ -153,13 +157,6 @@ end
@test LogDensityProblems.heuristic_chunks(82) == vcat(1:4:81, [82])
end

@testset "benchmark ForwardDiff chunk size" begin
ℓ = TransformedLogDensity(as(Array, 20), x -> -sum(abs2, x))
b = LogDensityProblems.benchmark_ForwardDiff_chunks(ℓ)
@test b isa Vector{Pair{Int,Float64}}
@test length(b) ≤ 20
end

@testset "AD via Tracker" begin
ℓ = TestLogDensity()
∇ℓ = ADgradient(:Tracker, ℓ)
@@ -221,65 +218,13 @@

@testset "ADgradient missing method" begin
msg = "Don't know how to AD with Foo, consider `import Foo` if there is such a package."
P = TransformedLogDensity(as(Array, 1), x -> sum(abs2, x))
@test_logs((:info, msg), @test_throws(MethodError, ADgradient(:Foo, P)))
end

####
#### transformed Bayesian problem
####

@testset "transformed Bayesian problem" begin
t = as((y = asℝ₊, ))
d = LogNormal(1.0, 2.0)
logposterior = ((x, ), ) -> logpdf(d, x)

# a Bayesian problem
p = TransformedLogDensity(t, logposterior)
@test repr(p) == "TransformedLogDensity of dimension 1"
@test dimension(p) == 1
@test p.transformation ≡ t
@test capabilities(p) == LogDensityOrder(0)

# gradient of a problem
∇p = ADgradient(:ForwardDiff, p)
@test dimension(∇p) == 1
@test parent(∇p).transformation ≡ t

for _ in 1:100
x = random_arg(t)
θ, lj = transform_and_logjac(t, x)
px = logdensity(p, x)
@test logpdf(d, θ.y) + lj ≈ (px::Real)
px2, ∇px = logdensity_and_gradient(∇p, x)
@test px2 == px
@test ∇px ≈ [ForwardDiff.derivative(x -> logpdf(d, exp(x)) + x, x[1])]
end
@test_logs((:info, msg), @test_throws(MethodError, ADgradient(:Foo, TestLogDensity2())))
end

@testset "-∞ log densities" begin
t = as(Array, 2)
validx = x -> all(x .> 0)
p = TransformedLogDensity(t, x -> validx(x) ? sum(abs2, x)/2 : -Inf)
∇p = ADgradient(:ForwardDiff, p)

@test dimension(p) == dimension(∇p) == TransformVariables.dimension(t)
@test p.transformation ≡ parent(∇p).transformation ≡ t

for _ in 1:100
x = random_arg(t)
px = logdensity(∇p, x)
px_∇px = logdensity_and_gradient(∇p, x)
@test px isa Real
@test px_∇px isa Tuple{Real,Any}
@test first(px_∇px) ≈ px
if validx(x)
@test px ≈ sum(abs2, x)/2
@test last(px_∇px) ≈ x
else
@test isinf(px)
end
end
@testset "benchmark ForwardDiff chunk size" begin
b = LogDensityProblems.benchmark_ForwardDiff_chunks(TestLogDensity2())
@test b isa Vector{Pair{Int,Float64}}
@test length(b) ≤ 20
end

@testset "stresstest" begin

2 comments on commit 428ea4c

@tpapp (Owner, Author) commented on 428ea4c, Aug 30, 2022

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/67337

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v1.0.0 -m "<description of version>" 428ea4c3a7e2b261fe506bc9148f85aa4d2197c5
git push origin v1.0.0
