Remove TransformVariables. #89

Merged: 12 commits, Aug 30, 2022
Changes from 6 commits
2 changes: 1 addition & 1 deletion .github/workflows/CI.yml
@@ -17,7 +17,7 @@ jobs:
fail-fast: false
matrix:
version:
- '1.0' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
- '1.6' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
- '1' # Leave this line unchanged. '1' will automatically expand to the latest stable 1.x release of Julia.
# - 'nightly' # NOTE: nightly disabled as it currently fails
os:
4 changes: 1 addition & 3 deletions Project.toml
@@ -9,7 +9,6 @@
DiffResults = "163ba53b-c6d8-5494-b064-1a9d43ac40c5"
DocStringExtensions = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Requires = "ae029012-a4dd-5104-9daa-d747884805df"
TransformVariables = "84d833dd-6860-57f9-a1a7-6da5db126cff"
UnPack = "3a884ed6-31ef-47d7-9d2a-63182c4928ed"

[compat]
@@ -18,9 +17,8 @@
BenchmarkTools = "1"
DiffResults = "0.0, 1"
DocStringExtensions = "0.8, 0.9"
Requires = "0.5, 1"
TransformVariables = "0.2, 0.3, 0.4, 0.5, 0.6"
UnPack = "0.1, 1"
julia = "1"
julia = "1.6"

[extras]
BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
4 changes: 2 additions & 2 deletions README.md
@@ -13,8 +13,8 @@ A common framework for implementing and using log densities for inference, provi

2. The [`ADgradient`](https://tamaspapp.eu/LogDensityProblems.jl/dev/#LogDensityProblems.ADgradient) wrapper, which makes objects that support `logdensity` for log density *values* also calculate log density *gradients*, using various automatic differentiation packages.

3. The wrapper [`TransformedLogDensity`](https://tamaspapp.eu/LogDensityProblems.jl/dev/#LogDensityProblems.TransformedLogDensity) using the [TransformVariables.jl](https://github.com/tpapp/TransformVariables.jl) package, allowing callables that take a set of parameters transformed from a flat vector of real numbers to support the `logdensity` interface.
3. Various utility functions for debugging and testing log densities.

4. Various utility functions for debugging and testing log densities.
**NOTE** As of version FIXME, transformed log densities have been moved to FIXME

See the [documentation](https://tpapp.github.io/LogDensityProblems.jl/dev) for details.
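With the TransformVariables wrapper removed, a problem supports the interface by defining `logdensity` and `dimension` directly, as `TestLogDensity2` does in the test changes below. A minimal self-contained sketch of that pattern (the empty generic function stubs are local stand-ins for the ones LogDensityProblems exports, so this runs without the package):

```julia
# Local stand-ins for the interface functions; in real code these come from
# LogDensityProblems via `import LogDensityProblems: dimension, logdensity`.
function dimension end
function logdensity end

# A toy problem: standard multivariate normal log density, up to a constant.
struct NormalProblem
    n::Int
end

dimension(p::NormalProblem) = p.n
logdensity(p::NormalProblem, x::AbstractVector) = -sum(abs2, x) / 2

p = NormalProblem(3)
dimension(p)             # 3
logdensity(p, zeros(3))  # 0.0
```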
85 changes: 4 additions & 81 deletions src/LogDensityProblems.jl
@@ -12,14 +12,13 @@ documentation.
"""
module LogDensityProblems

export TransformedLogDensity, ADgradient
export ADgradient

using ArgCheck: @argcheck
using DocStringExtensions: SIGNATURES, TYPEDEF
using UnPack: @unpack
import Random
using Random: AbstractRNG, default_rng
using Requires: @require
import TransformVariables
using UnPack: @unpack

####
#### interface for problems
@@ -128,50 +127,6 @@ The first argument (the log density) can be shifted by a constant, see the note
"""
function logdensity_and_gradient end

#####
##### Transformed log density (typically Bayesian inference)
#####

"""
TransformedLogDensity(transformation, log_density_function)

A problem in Bayesian inference. Vectors of length compatible with the dimension (obtained
from `transformation`) are transformed into a general object `θ` (unrestricted type, but a
named tuple is recommended for clean code), correcting for the log Jacobian determinant of
the transformation.

`log_density_function(θ)` is expected to return *real numbers*. For zero densities or
infeasible `θ`s, `-Inf` or similar should be returned, but for efficiency of inference most
methods recommend using `transformation` to avoid this. It is recommended that
`log_density_function` is a callable object that also encapsulates the data for the problem.

Use the property accessors `ℓ.transformation` and `ℓ.log_density_function` to access the
arguments of `ℓ::TransformedLogDensity`; these are part of the public API.

# Usage note

This is the most convenient way to define log densities, as `capabilities`, `logdensity`,
and `dimension` are automatically defined. To obtain a type that supports derivatives, use
[`ADgradient`](@ref).
"""
struct TransformedLogDensity{T <: TransformVariables.AbstractTransform, L}
transformation::T
log_density_function::L
end

function Base.show(io::IO, ℓ::TransformedLogDensity)
print(io, "TransformedLogDensity of dimension $(dimension(ℓ))")
end

capabilities(::Type{<:TransformedLogDensity}) = LogDensityOrder{0}()

dimension(p::TransformedLogDensity) = TransformVariables.dimension(p.transformation)

function logdensity(p::TransformedLogDensity, x::AbstractVector)
@unpack transformation, log_density_function = p
TransformVariables.transform_logdensity(transformation, log_density_function, x)
end

#####
##### AD wrappers --- interface and generic code
#####
@@ -244,38 +199,6 @@ function __init__()
@require Enzyme="7da242da-08ed-463a-9acd-ee780be4f1d9" include("AD_Enzyme.jl")
end

####
#### stress testing
####

"""
$(SIGNATURES)

Test `ℓ` with random values.

`N` random vectors are drawn from a standard multivariate Cauchy distribution, scaled with
`scale` (which can be a scalar or a conformable vector).

Each random vector is then used as an argument in `f(ℓ, ...)`. `logdensity` and
`logdensity_and_gradient` are recommended for `f`.

If a call produces an error, the argument is recorded as a failure; all failures
are returned by the function.

Not exported, but part of the API.
"""
function stresstest(f, ℓ; N = 1000, rng::Random.AbstractRNG = Random.GLOBAL_RNG, scale = 1)
failures = Vector{Float64}[]
d = dimension(ℓ)
for _ in 1:N
x = TransformVariables.random_reals(d; scale = scale, cauchy = true, rng = rng)
try
f(ℓ, x)
catch e
push!(failures, x)
end
end
failures
end
include("utilities.jl")

end # module
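For context on what the removed wrapper did: `TransformedLogDensity` mapped an unrestricted real vector to constrained parameters and added the log Jacobian determinant of the transformation. A self-contained miniature of that correction for a single positive parameter `θ = exp(x)` (the target density here is made up for illustration, and this is a sketch of the idea, not the package code):

```julia
# Toy unnormalized target on θ > 0 (a Gamma(2, 1) kernel, chosen for illustration).
target_logpdf(θ) = log(θ) - θ

# Transformed log density on all of ℝ: θ = exp(x), with the log Jacobian
# log|dθ/dx| = x added, mirroring TransformVariables.transform_logdensity
# in the deleted code above.
transformed_logdensity(x) = target_logpdf(exp(x)) + x

transformed_logdensity(0.0)  # target_logpdf(1.0) + 0.0 == -1.0
```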
61 changes: 61 additions & 0 deletions src/utilities.jl
@@ -0,0 +1,61 @@
#####
##### utilities
#####

####
#### random reals
####

function _random_reals_scale(rng::AbstractRNG, scale::Real, cauchy::Bool)
cauchy ? scale / abs2(randn(rng)) : scale * 1.0
end


"""
$(SIGNATURES)

Random vector in ``ℝⁿ`` of length `n`.

A standard multivariate normal or Cauchy is used, depending on `cauchy`, then scaled with
`scale`. `rng` is the random number generator used.

Not exported, but part of the API.
"""
function random_reals(n::Integer; scale::Real = 1, cauchy::Bool = false,
rng::AbstractRNG = default_rng())
randn(rng, n) .* _random_reals_scale(rng, scale, cauchy)
end

####
#### stress testing
####

"""
$(SIGNATURES)

Test `ℓ` with random values.

`N` random vectors are drawn from a standard multivariate Cauchy distribution, scaled with
`scale` (which can be a scalar or a conformable vector).

Each random vector is then used as an argument in `f(ℓ, ...)`. `logdensity` and
`logdensity_and_gradient` are recommended for `f`.

If a call produces an error, the argument is recorded as a failure; all failures
are returned by the function.

Not exported, but part of the API.
"""
function stresstest(f, ℓ; N = 1000, rng::AbstractRNG = default_rng(), scale = 1)
failures = Vector{Float64}[]
d = dimension(ℓ)
for _ in 1:N
x = random_reals(d; scale = scale, cauchy = true, rng = rng)
try
f(ℓ, x)
catch e
push!(failures, x)
end
end
failures
end
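The stresstest pattern above can be sketched in self-contained form (simplified assumptions: plain `randn` draws instead of `random_reals`, and the dimension passed explicitly rather than queried from `ℓ`):

```julia
# Simplified stresstest: call f on random vectors and collect the arguments
# that make it throw. (Sketch only; the package version uses Cauchy-scaled draws.)
function stresstest_sketch(f; N = 1000, d = 3)
    failures = Vector{Float64}[]
    for _ in 1:N
        x = randn(d)
        try
            f(x)
        catch
            push!(failures, x)  # record the offending argument
        end
    end
    failures
end

# A log density that is only defined on the positive orthant:
fragile(x) = all(>(0), x) ? -sum(abs2, x) : error("infeasible")

bad = stresstest_sketch(fragile; N = 100)
# every recorded failure has at least one non-positive coordinate
all(x -> any(<=(0), x), bad)
```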
79 changes: 12 additions & 67 deletions test/runtests.jl
@@ -9,11 +9,11 @@
struct EnzymeTestMode <: Enzyme.Mode end
end

using LogDensityProblems, Test, Distributions, TransformVariables, BenchmarkTools
using LogDensityProblems, Test, Distributions, BenchmarkTools
import LogDensityProblems: capabilities, dimension, logdensity
using LogDensityProblems: logdensity_and_gradient, LogDensityOrder

import ForwardDiff, Tracker, TransformVariables, Random, Zygote, ReverseDiff
import ForwardDiff, Tracker, Random, Zygote, ReverseDiff
using UnPack: @unpack

####
@@ -65,7 +65,7 @@ end
end

###
### simple log density for testing
### simple log densities for testing
###

struct TestLogDensity{F}
@@ -79,6 +79,10 @@ test_gradient(x) = x .* [-4, -6, -10]
TestLogDensity() = TestLogDensity(test_logdensity) # default: -Inf for negative input
Base.show(io::IO, ::TestLogDensity) = print(io, "TestLogDensity")

struct TestLogDensity2 end
logdensity(::TestLogDensity2, x) = -sum(abs2, x)
dimension(::TestLogDensity2) = 20

####
#### traits
####
@@ -153,13 +157,6 @@
@test LogDensityProblems.heuristic_chunks(82) == vcat(1:4:81, [82])
end

@testset "benchmark ForwardDiff chunk size" begin
ℓ = TransformedLogDensity(as(Array, 20), x -> -sum(abs2, x))
b = LogDensityProblems.benchmark_ForwardDiff_chunks(ℓ)
@test b isa Vector{Pair{Int,Float64}}
@test length(b) ≤ 20
end

@testset "AD via Tracker" begin
ℓ = TestLogDensity()
∇ℓ = ADgradient(:Tracker, ℓ)
@@ -221,65 +218,13 @@

@testset "ADgradient missing method" begin
msg = "Don't know how to AD with Foo, consider `import Foo` if there is such a package."
P = TransformedLogDensity(as(Array, 1), x -> sum(abs2, x))
@test_logs((:info, msg), @test_throws(MethodError, ADgradient(:Foo, P)))
end

####
#### transformed Bayesian problem
####

@testset "transformed Bayesian problem" begin
t = as((y = asℝ₊, ))
d = LogNormal(1.0, 2.0)
logposterior = ((x, ), ) -> logpdf(d, x)

# a Bayesian problem
p = TransformedLogDensity(t, logposterior)
@test repr(p) == "TransformedLogDensity of dimension 1"
@test dimension(p) == 1
@test p.transformation ≡ t
@test capabilities(p) == LogDensityOrder(0)

# gradient of a problem
∇p = ADgradient(:ForwardDiff, p)
@test dimension(∇p) == 1
@test parent(∇p).transformation ≡ t

for _ in 1:100
x = random_arg(t)
θ, lj = transform_and_logjac(t, x)
px = logdensity(p, x)
@test logpdf(d, θ.y) + lj ≈ (px::Real)
px2, ∇px = logdensity_and_gradient(∇p, x)
@test px2 == px
@test ∇px ≈ [ForwardDiff.derivative(x -> logpdf(d, exp(x)) + x, x[1])]
end
@test_logs((:info, msg), @test_throws(MethodError, ADgradient(:Foo, TestLogDensity2())))
end

@testset "-∞ log densities" begin
t = as(Array, 2)
validx = x -> all(x .> 0)
p = TransformedLogDensity(t, x -> validx(x) ? sum(abs2, x)/2 : -Inf)
∇p = ADgradient(:ForwardDiff, p)

@test dimension(p) == dimension(∇p) == TransformVariables.dimension(t)
@test p.transformation ≡ parent(∇p).transformation ≡ t

for _ in 1:100
x = random_arg(t)
px = logdensity(∇p, x)
px_∇px = logdensity_and_gradient(∇p, x)
@test px isa Real
@test px_∇px isa Tuple{Real,Any}
@test first(px_∇px) ≡ px
if validx(x)
@test px ≈ sum(abs2, x)/2
@test last(px_∇px) ≈ x
else
@test isinf(px)
end
end
@testset "benchmark ForwardDiff chunk size" begin
b = LogDensityProblems.benchmark_ForwardDiff_chunks(TestLogDensity2())
@test b isa Vector{Pair{Int,Float64}}
@test length(b) ≤ 20
end

@testset "stresstest" begin