New options for recompute(): maxError and attempts #70

Open
crsh opened this issue Apr 18, 2016 · 3 comments

crsh commented Apr 18, 2016

Hi Richard,

I'm currently writing an analysis script for a sequential Bayesian analysis. I will run the script every night to decide whether to continue data collection. For obvious reasons, I want to base this decision on precise estimates of the Bayes factors, so I want to recompute the BayesFactor objects until a desired level of precision is reached, without having to manually fiddle with the number of iterations and recomputation attempts.

My current approach is simply this:

while(any(object@bayesFactor$error > 0.01)) object <- recompute(object)

I think it would be helpful to offer this functionality within recompute(), e.g., by providing an option maxError. Additionally, an option such as attempts could set the maximum number of recomputation attempts allowed in trying to satisfy the maxError condition.

object <- recompute(object, maxError = 0.01, attempts = 25)

This, to me at least, would be quite helpful.
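
To make the idea concrete, here is a rough sketch of the behavior I have in mind, written as a wrapper around the existing recompute() (recomputeUntil is just a hypothetical name, not a proposed API):

recomputeUntil <- function(object, maxError = 0.01, attempts = 25) {
  # Keep recomputing until every proportional error in the bayesFactor
  # slot drops below maxError, or until 'attempts' is exhausted.
  for (i in seq_len(attempts)) {
    if (all(object@bayesFactor$error < maxError, na.rm = TRUE)) break
    object <- BayesFactor::recompute(object)
  }
  object
}

object <- recomputeUntil(object, maxError = 0.01, attempts = 25)

This is essentially my while loop above, plus a cap on the number of attempts (the na.rm = TRUE guards against models whose error is NA).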

I'm happy to give this a shot myself and make a pull request, if you like the idea.

Best regards,
Frederik

crsh commented Apr 29, 2016

The more I think about it, the more I feel that maxError would also be nice to have in the original call to, e.g., anovaBF().

richarddmorey (Owner) commented

Well, as to having it in the original call: there are cases where the Bayes factor is so large that the error makes no difference. Consider a Bayes factor of 4e10 ± 20%. Do we really want an algorithm to get that under, say, 1%? Not really. The Bayes factor is so large that we would simply ignore the 20% error; otherwise we're just wasting CPU cycles.

Perhaps what we need is iterative cycles of ignoring and zooming in on the models we want (or the best models), rather than simply going for a maxError.
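
Roughly, something like this untested sketch (zoomRecompute is a hypothetical name; I'm assuming the bf column of the bayesFactor slot is on the log scale):

zoomRecompute <- function(object, maxError = 0.01, window = log(100),
                          attempts = 25) {
  # Only refine models whose log BF is within 'window' of the best model;
  # enormous (or hopeless) Bayes factors keep their coarse error estimate.
  bf   <- object@bayesFactor$bf
  near <- which(bf > max(bf, na.rm = TRUE) - window)
  rest <- setdiff(seq_len(nrow(object@bayesFactor)), near)
  best <- object[near]
  for (i in seq_len(attempts)) {
    if (all(best@bayesFactor$error < maxError, na.rm = TRUE)) break
    best <- BayesFactor::recompute(best)
  }
  if (length(rest) == 0) best else c(best, object[rest])
}

That way the CPU time goes only to the models where precision actually matters.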

richarddmorey (Owner) commented

But a pull request for recompute would be welcome.
