
Cross-Validation (pseudo-likelihood) Criterion #118

Open
evanmunro opened this issue Jul 27, 2020 · 1 comment

evanmunro commented Jul 27, 2020

For a specific Gaussian process regression use case, I am finding that optimizing the kernel hyperparameters using the marginal likelihood logpdf results in poor generalization.

I would like to try optimizing the kernel hyperparameters instead with respect to the pseudo-likelihood, Equations 5.11/5.12 in Rasmussen and Williams, Chapter 5.
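For reference, the criterion from those equations can be sketched in a few lines of numpy. This is a language-agnostic illustration, not Stheno.jl code; the RBF kernel and the hyperparameter values are placeholder assumptions. The key point of Eq. 5.12 is that all n leave-one-out predictive means and variances come directly from a single inverse of the full covariance matrix, so no refitting is needed:

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on 1-D inputs; a placeholder for
    # whatever kernel is actually being optimized.
    return variance * np.exp(-0.5 * (X[:, None] - Y[None, :]) ** 2 / lengthscale**2)

def loo_pseudo_log_likelihood(X, y, lengthscale=1.0, variance=1.0, noise=0.1):
    # LOO-CV pseudo-likelihood, Eqs. 5.10-5.12 of Rasmussen & Williams.
    # The leave-one-out predictive moments come straight from K^{-1}:
    #   mu_i      = y_i - [K^{-1} y]_i / [K^{-1}]_{ii}     (Eq. 5.12)
    #   sigma_i^2 = 1 / [K^{-1}]_{ii}                      (Eq. 5.12)
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
    K_inv = np.linalg.inv(K)  # fine for a sketch; use a Cholesky factor in practice
    alpha = K_inv @ y
    diag = np.diag(K_inv)
    mu = y - alpha / diag
    var = 1.0 / diag
    # Eqs. 5.10/5.11: sum of per-point Gaussian log densities of the
    # held-out observations under their LOO predictive distributions.
    return np.sum(-0.5 * np.log(2 * np.pi * var) - (y - mu) ** 2 / (2 * var))
```

Maximizing this quantity over the kernel hyperparameters, rather than the marginal likelihood, is the proposed criterion.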

I'm happy to make an attempt at implementing this, but I'm wondering whether there has been any previous thought about doing so, or whether there are any suggestions on how to structure it.

willtebbutt (Member) commented Jul 28, 2020

Hi @evanmunro. I hadn't given any previous thought to implementing this, but an implementation would be welcome.

In terms of implementing the objective function, I would suggest taking inspiration from https://github.com/willtebbutt/Stheno.jl/blob/68c4ce277ddc7284971232bf002d7e273a76039e/src/abstract_gp.jl#L197, and I would recommend putting your implementation of the objective immediately below the compute_intermediates implementation in the same file. I would also recommend against using the gradient computations suggested in the textbook, and instead just relying on Zygote.
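To make the overall workflow concrete, here is a self-contained, language-agnostic sketch of using the LOO criterion for hyperparameter selection. It is plain numpy rather than Stheno.jl, with a grid search standing in for the gradient-based optimizer that Zygote would enable; the kernel, the noise level, and the single log-lengthscale parameter are all assumptions for illustration:

```python
import numpy as np

def neg_loo(log_ls, X, y, noise=0.1):
    # Negative LOO pseudo-log-likelihood (Eqs. 5.10-5.12) as a function
    # of the log-lengthscale of a squared-exponential kernel.
    ls = np.exp(log_ls)
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ls**2) + noise * np.eye(len(X))
    K_inv = np.linalg.inv(K)
    alpha, diag = K_inv @ y, np.diag(K_inv)
    mu, var = y - alpha / diag, 1.0 / diag
    return -np.sum(-0.5 * np.log(2 * np.pi * var) - (y - mu) ** 2 / (2 * var))

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 25)
y = np.sin(X) + 0.1 * rng.normal(size=25)

# Grid search over log-lengthscale; in the Julia implementation a
# gradient-based optimizer with Zygote-supplied gradients would replace this.
grid = np.linspace(-3.0, 3.0, 61)
best = grid[int(np.argmin([neg_loo(g, X, y) for g in grid]))]
lengthscale = np.exp(best)
```

The objective is differentiable in the hyperparameters, which is why deferring to reverse-mode AD is simpler than hand-coding the textbook's gradient expressions.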

Very happy to review whatever you produce :)
