
Commit

revise 3.1 Expected Improvement (EI)
xuyuting committed Aug 18, 2024
1 parent 5272736 commit 66ec24d
Showing 2 changed files with 33 additions and 22 deletions.
2 changes: 1 addition & 1 deletion docs/wiki/4_SurrogateModel.md
@@ -2,7 +2,7 @@

## 1. Introduction

The [`obsidian.surrogates`](https://github.com/MSDLLCpapers/obsidian/tree/main/obsidian/surrogates) submodule is a key component of the Obsidian APO library. It provides a collection of surrogate models used to approximate the objective function in the optimization process. These surrogate models are essential for efficient exploration of the parameter space and for making informed decisions about which points to evaluate next.

## 2. Basic Syntax

53 changes: 32 additions & 21 deletions docs/wiki/5_AcquisitionFunction.md
@@ -73,18 +73,39 @@ The `acquisition` parameter should always be a list, containing either strings or dictionaries.
EI calculates the expected amount by which we will improve upon the current best observed value.

Mathematical formulation:
\begin{equation*}
EI(x) = E[\max(\hat{f}(x) - y_{best}, 0)]
\end{equation*}

where $y_{best}$ is the current best observed value. The expression $\max(\hat{f}(x) - y_{best})$ captures the potential improvement over the current best observed value, and the expectation $E[\,\cdot\,]$ averages this improvement over the posterior distribution of the surrogate model's predictions.

_Optional hyperparameters:_

* `inflate`: Increase the current best value $y_{best}$ to $(1+\text{inflate})\,y_{best}$, enabling a more flexible exploration-exploitation trade-off.

\begin{equation*}
EI(x) = E[\max(\hat{f}(x) - (1+\text{inflate})\,y_{best}, 0)]
\end{equation*}

The default value is 0 (no inflation).
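For intuition, when the surrogate posterior at $x$ is Gaussian with mean $\mu$ and standard deviation $\sigma$, this expectation has the well-known closed form $EI(x) = (\mu - y_{best})\Phi(z) + \sigma\phi(z)$ with $z = (\mu - y_{best})/\sigma$. A minimal standalone sketch of that formula (plain Python, not the obsidian implementation):

```python
import math

def expected_improvement(mu: float, sigma: float, y_best: float,
                         inflate: float = 0.0) -> float:
    """Closed-form EI for a Gaussian posterior N(mu, sigma^2), maximization.

    `inflate` raises the incumbent target to (1 + inflate) * y_best,
    mirroring the optional hyperparameter described in the text.
    """
    target = (1.0 + inflate) * y_best
    if sigma <= 0.0:                # degenerate posterior: no uncertainty left
        return max(mu - target, 0.0)
    z = (mu - target) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (mu - target) * cdf + sigma * pdf
```

Note how a positive `inflate` (with $y_{best} > 0$) raises the improvement target, which suppresses EI near the incumbent and relatively favors higher-variance, more exploratory candidates.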

**Example usage:**

* Default:

```python
from obsidian.optimizer import BayesianOptimizer

optimizer = BayesianOptimizer(X_space=param_space)
X_suggest, eval_suggest = optimizer.suggest(acquisition=['EI'])
```

* With all available hyperparameters:

```python
X_suggest, eval_suggest = optimizer.suggest(acquisition=[{'EI': {'inflate': 0.05}}])
```

### 3.2 Upper Confidence Bound (UCB)

@@ -120,17 +141,7 @@ For multi-objective optimization problems, you can use specialized acquisition functions:
X_suggest, eval_suggest = optimizer.suggest(acquisition=['NEHVI'])
```

### 4.2 Custom Acquisition Functions

If you need to implement a custom acquisition function, you can extend the `MCAcquisitionFunction` class from BoTorch:
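
The core computation such a Monte-Carlo acquisition subclass performs is: draw posterior samples of $\hat{f}(x)$, apply the acquisition's utility, and average. A BoTorch-free illustration of that pattern for EI, where `sample_f` is a hypothetical stand-in for the surrogate's posterior sampler (not the obsidian or BoTorch API):

```python
import random

def mc_expected_improvement(sample_f, y_best, n_samples=4096, seed=0):
    """Monte-Carlo estimate of EI(x) = E[max(f_hat(x) - y_best, 0)].

    sample_f(rng) draws one posterior sample of f(x); it stands in for
    the surrogate model's posterior (hypothetical helper).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        total += max(sample_f(rng) - y_best, 0.0)  # per-sample utility
    return total / n_samples

# Standard-normal posterior at some x: the MC estimate should approach
# the closed-form value 1/sqrt(2*pi)
ei_estimate = mc_expected_improvement(lambda rng: rng.gauss(0.0, 1.0), y_best=0.0)
```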

