
Add gl global min #1

Open · wants to merge 1 commit into base: master

Conversation

@looraL (Collaborator) commented Oct 11, 2018

No description provided.

@@ -4,7 +4,7 @@ include("parameter.jl")
 function record_current(io, i, j, estimate, estimates)
     if io != nothing
         prop = j/((i-1)*ne+j)
-        est = (1-prop)*estimates + (prop)*mean(estimates[1:j])
+        est = (1-prop)*estimate + (prop)*mean(estimates[1:j])
Owner:
This might be the bug we are looking for, @Chamusssss!
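A minimal sketch of why the original line is a bug (the values of `prop` and the vectors are illustrative, not from the PR): `estimates` is the whole vector of per-replication estimates, so multiplying it instead of the scalar `estimate` makes `est` an array rather than the intended scalar blend of the current estimate with the running mean.

```julia
using Statistics  # for mean

estimates = [1.0, 2.0, 3.0]   # per-replication estimates so far (illustrative)
estimate  = 3.0               # the current replication's estimate
prop      = 0.5               # illustrative mixing weight
j         = 3

# Buggy version: `estimates` is a Vector, so `est` becomes a Vector
est_buggy = (1 - prop) * estimates .+ prop * mean(estimates[1:j])

# Fixed version: blends the scalar estimate with the running mean
est_fixed = (1 - prop) * estimate + prop * mean(estimates[1:j])

println(typeof(est_buggy))  # Vector{Float64}, not a scalar
println(est_fixed)          # 2.5
```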

OuterLevelTwisting(N, C, S, μ, innerlevel, Ψ, initialguess, n_restarts)
#initialguess = zeros(S)
#n_restarts = 20
n_restarts = 2
Owner:
Try to avoid committing debugging comments; remove the commented-out lines before merging.

end


get_result(t::OuterLevelTwisting) = t.μ
set_result!(t::OuterLevelTwisting, μ) = (t.μ[:] = μ)


-function twist!(t::OuterLevelTwisting, parameter::Parameter)
+function twist_local!(t::OuterLevelTwisting, parameter::Parameter)
Owner:
Try merging twist_local! and twist_global! into a single twist!. Specifically, it might be good to add a flag to struct OuterLevelTwisting that multiplexes the two options, and have twist! implement the different behavior based on that flag.
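A sketch of the suggestion above, assuming a reduced struct layout: the field name `global_twist` is hypothetical, and the real OuterLevelTwisting carries more state (initial guess, restart count, etc.) than shown here.

```julia
# Hypothetical reduced version of OuterLevelTwisting with a mode flag.
mutable struct OuterLevelTwisting
    μ::Vector{Float64}
    global_twist::Bool   # true → global minimization, false → local
end

# One twist! entry point that multiplexes on the flag, replacing the
# separate twist_local!/twist_global! functions.
function twist!(t::OuterLevelTwisting, parameter)
    if t.global_twist
        # ... global minimization with multiple restarts (was twist_global!) ...
    else
        # ... local minimization from a single initial guess (was twist_local!) ...
    end
end
```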

@@ -261,7 +312,7 @@ end
 ⋅ Z ∼ N(μ, I) where μ is shifted mean that minimizes variance
 ⋅ W ∼ q where q is shifted bernoulli probability that minimizes upper bound on the variance
 """
-function glassermanli_mc(parameter::Parameter, sample_size::Tuple{Int64, Int64}, extra_params=(nothing, nothing), io::Union{IO, Nothing}=nothing)
+function glassermanli_mc_local(parameter::Parameter, sample_size::Tuple{Int64, Int64}, filename:: String, extra_params=(nothing, nothing))
Owner:
It is not a good idea to make filename a mandatory argument to the function. What if I just want to see the result without saving intermediates to that file? The io argument existed previously precisely so that, if you want the intermediates, you supply it as an optional parameter, and omit it otherwise.
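A sketch of the signature the reviewer is asking to keep, based on the original `glassermanli_mc` shown in the diff (the body here is a placeholder, not the PR's implementation): `io` defaults to `nothing`, so callers who only want the final result never touch a file.

```julia
# Hypothetical skeleton keeping `io` optional, per the original signature.
function glassermanli_mc(parameter, sample_size::Tuple{Int64,Int64},
                         extra_params=(nothing, nothing),
                         io::Union{IO,Nothing}=nothing)
    # ... simulation loop ...
    if io != nothing
        # record intermediates only when a sink was supplied, e.g.
        # record_current(io, i, j, estimate, estimates)
    end
    # ... return the final estimate ...
end

# Caller decides whether to persist intermediates:
#   glassermanli_mc(p, (100, 10))                    # no file involved
#   open("run.csv", "w") do io
#       glassermanli_mc(p, (100, 10), (nothing, nothing), io)
#   end
```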

@tt6746690 (Owner) commented Oct 11, 2018

Would definitely want to merge local and global together.

  • If we have decided which one to use, hard-code that default.
  • If we will run experiments comparing the two methods, supply a flag via extra_params that decides whether to use local or global, i.e. extra_params=(<value_for_mu> | "local" | "global", <value_for_sigma> | nothing). Or something like this.
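The bullet above can be sketched as a small dispatch helper. This is an assumption about how the `extra_params` convention might look, not code from the PR; `resolve_mu` is a hypothetical name, and `get_result` is the accessor shown earlier in the diff.

```julia
# Hypothetical dispatch on the first element of extra_params, as suggested:
# it is either a concrete μ vector, or "local"/"global" selecting the method.
function resolve_mu(extra_params, twister, parameter)
    mu_spec = extra_params[1]
    if mu_spec == "global"
        twist!(twister, parameter)   # run the (merged) twist with global mode
        return get_result(twister)
    elseif mu_spec == "local"
        twist!(twister, parameter)   # same entry point, local mode set on twister
        return get_result(twister)
    elseif mu_spec isa AbstractVector
        return mu_spec               # caller supplied μ directly
    else
        error("unrecognized μ specification in extra_params")
    end
end
```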

2 participants