From 66ec24d462dadd026aae4445728eaa10b79b011b Mon Sep 17 00:00:00 2001
From: Yuting Xu <12775874+xuyuting@users.noreply.github.com>
Date: Sun, 18 Aug 2024 11:56:25 -0400
Subject: [PATCH] revise 3.1 Expected Improvement (EI)

---
 docs/wiki/4_SurrogateModel.md      |  2 +-
 docs/wiki/5_AcquisitionFunction.md | 53 ++++++++++++++++++------------
 2 files changed, 33 insertions(+), 22 deletions(-)

diff --git a/docs/wiki/4_SurrogateModel.md b/docs/wiki/4_SurrogateModel.md
index 1a9c3e4..47fd74b 100644
--- a/docs/wiki/4_SurrogateModel.md
+++ b/docs/wiki/4_SurrogateModel.md
@@ -2,7 +2,7 @@
 
 ## 1. Introduction
 
-The `obsidian.surrogates` submodule is a key component of the Obsidian APO library. It provides a collection of surrogate models used to approximate the objective function in the optimization process. These surrogate models are essential for efficient exploration of the parameter space and for making informed decisions about which points to evaluate next.
+The [`obsidian.surrogates`](https://github.com/MSDLLCpapers/obsidian/tree/main/obsidian/surrogates) submodule is a key component of the Obsidian APO library. It provides a collection of surrogate models used to approximate the objective function in the optimization process. These surrogate models are essential for efficient exploration of the parameter space and for making informed decisions about which points to evaluate next.
 
 ## 2. Basic Syntax
 
diff --git a/docs/wiki/5_AcquisitionFunction.md b/docs/wiki/5_AcquisitionFunction.md
index 8b9f3b5..3a49534 100644
--- a/docs/wiki/5_AcquisitionFunction.md
+++ b/docs/wiki/5_AcquisitionFunction.md
@@ -73,18 +73,39 @@ The `acquisition` parameter should be always a list, containing either string or
 EI calculates the expected amount by which we will improve upon the current best observed value.
 
 Mathematical formulation:
-```
-EI(x) = E[max(f(x) - f(x+), 0)]
-```
-where f(x+) is the current best observed value.
+\begin{equation*}
+EI(x) = E[\max(\hat{f}(x) - y_{best}, 0)]
+\end{equation*}
 
-Example usage:
-```python
-from obsidian.optimizer import BayesianOptimizer
+where $y_{best}$ is the current best observed value.
+The expression $\max(\hat{f}(x) - y_{best}, 0)$ captures the potential improvement over the current best observed value, and the expectation $E[\cdot]$ averages this improvement over the posterior distribution of the surrogate model predictions.
 
-optimizer = BayesianOptimizer(X_space=param_space)
-X_suggest, eval_suggest = optimizer.suggest(acquisition=['EI'])
-```
+_Optional hyperparameters:_
+
+* `inflate`: Increase the current best value $y_{best}$ to $(1+inflate)*y_{best}$, enabling a more flexible exploration-exploitation trade-off.
+
+  \begin{equation*}
+  EI(x) = E[\max(\hat{f}(x) - (1+inflate)*y_{best}, 0)]
+  \end{equation*}
+
+  The default value is 0 (no inflation).
+
+**Example usage:**
+
+* Default:
+
+  ```python
+  from obsidian.optimizer import BayesianOptimizer
+
+  optimizer = BayesianOptimizer(X_space=param_space)
+  X_suggest, eval_suggest = optimizer.suggest(acquisition=['EI'])
+  ```
+
+* With all available hyperparameters:
+
+  ```python
+  X_suggest, eval_suggest = optimizer.suggest(acquisition=[{'EI': {'inflate': 0.05}}])
+  ```
 
 ### 3.2 Upper Confidence Bound (UCB)
 
@@ -120,17 +141,7 @@ For multi-objective optimization problems, you can use specialized acquisition f
 X_suggest, eval_suggest = optimizer.suggest(acquisition=['NEHVI'])
 ```
 
-### 4.2 Customizing Acquisition Functions
-
-Some acquisition functions accept parameters to customize their behavior. These can be specified in the `suggest` method:
-
-```python
-X_suggest, eval_suggest = optimizer.suggest(
-    acquisition=[{'EI': {'inflate': 0.01}}]
-)
-```
-
-### 4.3 Custom Acquisition Functions
+### 4.2 Custom Acquisition Functions
 
 If you need to implement a custom acquisition function, you can extend the `MCAcquisitionFunction` class from BoTorch: