Issues using LKJL #100

Open
sethaxen opened this issue May 31, 2021 · 3 comments

Comments

@sethaxen
Collaborator

I came across some issues with MeasureTheory.LKJL while trying to use it with Soss. I'm posting the issue here because I'm not 100% sure which issues need to be resolved here or there.

using Soss, MeasureTheory, SampleChainsDynamicHMC

m = @model N begin
    σ ~ HalfNormal(1) |> iid(N)
    L ~ LKJL(N, 2)
    ŷ ~ Normal() |> iid(N)
    y = σ .* (L * ŷ)  # same as y ~ MvNormal(zeros(N), σ .* (L * L') .* σ')
end
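
As a quick aside, the covariance identity in the comment above can be checked directly: for any vector σ and any lower-triangular L, σ .* (L * L') .* σ' equals Diagonal(σ) * (L * L') * Diagonal(σ), which is the covariance of σ .* (L * ŷ) when ŷ has identity covariance. A minimal check with hypothetical stand-ins (not draws from the model):

julia> using LinearAlgebra

julia> σ, L = rand(4), LowerTriangular(randn(4, 4));  # hypothetical stand-ins

julia> σ .* (L * L') .* σ' ≈ Diagonal(σ) * (L * L') * Diagonal(σ)
true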

Drawing random samples does not work:

julia> rand(m(4))
ERROR: MethodError: no method matching distproxy(::LKJL{4, (:η,), Tuple{Int64}})

Distributions' LKJ isn't equivalent to LKJL, so we instead overload rand ourselves:

julia> using LinearAlgebra, Random

julia> function Base.rand(rng::AbstractRNG, ::Type, d::LKJL{k}) where {k}
           return cholesky(rand(rng, MeasureTheory.Dists.LKJ(k, d.η))).L
       end;
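
(This works because LKJL(k, η) is the distribution of the lower Cholesky factor of an LKJ(k, η)-distributed correlation matrix, so taking cholesky(C).L of a draw C from Dists.LKJ reproduces it. It's only a stopgap for sampling here, not a proposal for how rand should ultimately be implemented in MeasureTheory.)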

julia> rand(LKJL(4, 2))  # works!
4×4 LowerTriangular{Float64, Matrix{Float64}}:
 1.0         ⋅          ⋅          ⋅
 0.476038   0.879425    ⋅          ⋅
 0.140615  -0.272203    0.95191    ⋅
 0.363825   0.384274   -0.341334  0.776824

julia> pred = rand(m(4))  # also works!
(y = [0.44037889402215297, 1.5938652701305567, -0.23702427205378396, -0.5424312905721139], σ = [1.1175273174818694, 1.7836751630330654, 1.3148335815282792, 0.3861023795641398], ŷ = [0.3940654399522521, 0.8021362778335628, 0.3654790805746502, -1.595705932814744], L = [1.0 0.0 0.0 0.0; 0.4261101513982992 0.9046712877478309 0.0 0.0; 0.2419199606694756 -0.6653662683891962 0.7062311671963476 0.0; -0.07132419227849153 -0.057620246988085724 0.37930698942361735 0.920716554921896])

julia> post = sample(DynamicHMCChain, m(4) | (y=pred.y,), 1_000)
ERROR: StackOverflowError:
Stacktrace:
 [1] xform(::Pushforward{TransformVariables.CorrCholeskyFactor, ProductMeasure{MappedArrays.ReadonlyMappedArray{Lebesgue{ℝ}, 1, CartesianIndices{1, Tuple{Base.OneTo{Int64}}}, MeasureTheory.var"#24#25"{Lebesgue{ℝ}}}}}, _data::NamedTuple{(), Tuple{}})
   @ Soss ~/.julia/packages/Soss/9Knnt/src/primitives/xform.jl:134
 [2] xform(::Pushforward{TransformVariables.CorrCholeskyFactor, ProductMeasure{MappedArrays.ReadonlyMappedArray{Lebesgue{ℝ}, 1, CartesianIndices{1, Tuple{Base.OneTo{Int64}}}, MeasureTheory.var"#24#25"{Lebesgue{ℝ}}}}}) (repeats 79983 times)
   @ Soss ~/.julia/packages/Soss/9Knnt/src/primitives/xform.jl:135

The issue is that xform calls representative on its argument and then calls xform on the result, but once representative has been applied to LKJL it reaches a fixed point (further calls return the same measure), so xform keeps recursing on an unchanging argument until the stack overflows.
MeasureTheory already defines TransformVariables.as for LKJL, so we just overload xform to point to that:

julia> using Soss: TransformVariables

julia> Soss.xform(d::LKJL, _data::NamedTuple=NamedTuple()) = TransformVariables.as(d);

julia> xform(LKJL(4, 2))
TransformVariables.CorrCholeskyFactor(4)

julia> post = sample(DynamicHMCChain, m(4) | (y=pred.y,), 1_000)
4000-element MultiChain with 4 chains and schema (σ = Vector{Float64}, ŷ = Vector{Float64}, L = UpperTriangular{Float64, Matrix{Float64}})
(σ = [0.85±0.66, 0.81±0.71, 0.84±0.51, 0.87±0.64], ŷ = [-0.01±0.85, 0.1±1.3, -0.21±1.1, -0.02±1.1], L = [1.0±0.0 0.007±0.33 0.049±0.35 -0.0±0.28; 0.0±0.0 0.942±0.078 0.039±0.3 0.02±0.34; 0.0±0.0 0.0±0.0 0.877±0.11 0.02±0.43; 0.0±0.0 0.0±0.0 0.0±0.0 0.77±0.17])

But while we expected L to be LowerTriangular, it's actually UpperTriangular:

julia> post.L[1]
4×4 UpperTriangular{Float64, Matrix{Float64}}:
 1.0  0.179572  -0.623481  -0.0554051
  ⋅   0.983745  -0.121416   0.15332
  ⋅    ⋅         0.772353   0.225203
  ⋅    ⋅          ⋅         0.960576

This happens because, even though LKJL is adapted from AltDistributions.jl, which lives in the same ecosystem as TransformVariables.jl, CorrCholeskyFactor transforms to an UpperTriangular matrix, while LKJL works with a LowerTriangular matrix.
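
As a downstream workaround (not a fix in MeasureTheory itself), the sampled factors can be transposed back to the lower-triangular convention: assuming CorrCholeskyFactor's U satisfies U'U = C, then L = U' satisfies LL' = C. For example:

julia> U = post.L[1];

julia> L_lower = LowerTriangular(collect(U'));  # transpose back to the LKJL (lower-triangular) convention

julia> L_lower * L_lower' ≈ U' * U              # same correlation matrix either way
true

julia> all(diag(L_lower * L_lower') .≈ 1)       # unit diagonal, as expected for a correlation factor
true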

@cscherrer cscherrer transferred this issue from cscherrer/Soss.jl May 31, 2021
@cscherrer
Collaborator

Thanks @sethaxen. The LKJL here is in the very early stages, so I think this is very likely an issue with the implementation itself; I've moved the issue to MeasureTheory. This is an important measure to have available, so I'll try to spend some time with it today. Hopefully it won't take too long, and I'm sure the detail you give above will be really helpful in getting this in place quickly.

BTW there are lots of (good, I think) changes on the way. I've added lots of comments to this file to better describe adding new parameterized measures:
https://github.com/cscherrer/MeasureTheory.jl/blob/cs-keywordcalls/src/parameterized/normal.jl

@sethaxen
Collaborator Author

> Thanks @sethaxen. The LKJL here is in the very early stages, so I think this is very likely an issue with the implementation itself; I've moved the issue to MeasureTheory. This is an important measure to have available, so I'll try to spend some time with it today. Hopefully it won't take too long, and I'm sure the detail you give above will be really helpful in getting this in place quickly.

Great, thanks! Yes, I agree, especially since Distributions does not have this measure.

> BTW there are lots of (good, I think) changes on the way. I've added lots of comments to this file to better describe adding new parameterized measures:
> https://github.com/cscherrer/MeasureTheory.jl/blob/cs-keywordcalls/src/parameterized/normal.jl

Looks great! I'm especially loving the keyword aliases.

@sethaxen
Collaborator Author

sethaxen commented Jun 1, 2021

Actually, I think there is an error in the logdensity of LKJL, which was carried over from AltDistributions. Compare with JuliaStats/Distributions.jl#1336 (I've verified the Jacobian determinant there with FiniteDifferences) and Stan's documentation on the density. Specifically, at https://github.com/cscherrer/MeasureTheory.jl/blob/v0.7.0/src/parameterized/LKJL.jl#L41, we have c = k + 1 + 2(η - 1), when it should be c = k + 2(η - 1).
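
For reference, Stan documents the density of the Cholesky factor L of a K×K LKJ-distributed correlation matrix as, up to normalization,

    LkjCholesky(L | η) ∝ ∏_{i=2}^{K} L_{ii}^{K − i + 2(η − 1)},

so the exponent attached to L_ii is (K + 2(η − 1)) − i. Assuming c enters the logdensity as the exponent c − i on L_ii (a reading of the code I haven't re-derived, so treat this as a consistency check rather than a proof), that matches c = k + 2(η − 1) and not c = k + 1 + 2(η − 1).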
