
Commit

Update documentation
baniasbaabe committed Oct 25, 2023
1 parent 4876301 commit 880e564
Showing 3 changed files with 148 additions and 1 deletion.
82 changes: 82 additions & 0 deletions _sources/book/machinelearning/modeltraining.ipynb
@@ -1562,6 +1562,88 @@
"visualizer.score(X_test, y_test)\n",
"visualizer.show() "
]
},
{
"cell_type": "markdown",
"id": "00d30f01",
"metadata": {},
"source": [
"## Powerful and Distributed Hyperparameter Optimization with `ray.tune`"
]
},
{
"cell_type": "markdown",
"id": "39c5824a",
"metadata": {},
"source": [
"Do you need hyperparameter tuning on steroids?\n",
"\n",
"Try `tune` from `ray`.\n",
"\n",
"`tune` performs distributed hyperparameter tuning with multi-GPU and multi-node support, utilizing all the hardware you have.\n",
"\n",
"It supports the most popular ML libraries and integrates many other common hyperparameter optimization tools like Optuna or Hyperopt."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install \"ray[tune]\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# !pip install \"ray[tune]\"\n",
"import sklearn.datasets\n",
"import sklearn.metrics\n",
"import sklearn.datasets\n",
"import sklearn.metrics\n",
"import xgboost as xgb\n",
"from ray import train, tune\n",
"from sklearn.model_selection import train_test_split\n",
"\n",
"\n",
"def train_breast_cancer(config):\n",
" data, labels = sklearn.datasets.load_breast_cancer(return_X_y=True)\n",
" train_x, test_x, train_y, test_y = train_test_split(data, labels, test_size=0.25)\n",
" train_set = xgb.DMatrix(train_x, label=train_y)\n",
" test_set = xgb.DMatrix(test_x, label=test_y)\n",
" results = {}\n",
" xgb.train(\n",
" config,\n",
" train_set,\n",
" evals=[(test_set, \"eval\")],\n",
" evals_result=results,\n",
" verbose_eval=False,\n",
" )\n",
" accuracy = 1.0 - results[\"eval\"][\"error\"][-1]\n",
" train.report({\"mean_accuracy\": accuracy, \"done\": True})\n",
"\n",
"\n",
"config = {\n",
" \"objective\": \"binary:logistic\",\n",
" \"eval_metric\": [\"logloss\", \"error\"],\n",
" \"min_child_weight\": tune.choice([1, 2, 3]),\n",
" \"subsample\": tune.uniform(0.5, 1.0),\n",
"}\n",
"\n",
"tuner = tune.Tuner(\n",
" train_breast_cancer,\n",
" tune_config=tune.TuneConfig(\n",
" num_samples=10,\n",
" ),\n",
" param_space=config,\n",
")\n",
"results = tuner.fit()\n",
"print(results.get_best_result(metric=\"mean_accuracy\", mode=\"max\").config)"
]
}
],
"metadata": {
65 changes: 65 additions & 0 deletions book/machinelearning/modeltraining.html
@@ -430,6 +430,7 @@ <h2> Contents </h2>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#visualize-high-performance-features-with-optuna">5.5.22. Visualize high-performance Features with <code class="docutils literal notranslate"><span class="pre">Optuna</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#model-ensembling-with-combo">5.5.23. Model Ensembling with <code class="docutils literal notranslate"><span class="pre">combo</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#residual-plots-with-yellowbrick">5.5.24. Residual Plots with <code class="docutils literal notranslate"><span class="pre">yellowbrick</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#powerful-and-distributed-hyperparameter-optimization-with-ray-tune">5.5.25. Powerful and Distributed Hyperparameter Optimization with <code class="docutils literal notranslate"><span class="pre">ray.tune</span></code></a></li>
</ul>
</nav>
</div>
@@ -1430,6 +1431,69 @@ <h2><span class="section-number">5.5.24. </span>Residual Plots with <code class=
</div>
</div>
</section>
<section id="powerful-and-distributed-hyperparameter-optimization-with-ray-tune">
<h2><span class="section-number">5.5.25. </span>Powerful and Distributed Hyperparameter Optimization with <code class="docutils literal notranslate"><span class="pre">ray.tune</span></code><a class="headerlink" href="#powerful-and-distributed-hyperparameter-optimization-with-ray-tune" title="Permalink to this heading">#</a></h2>
<p>Do you need hyperparameter tuning on steroids?</p>
<p>Try <code class="docutils literal notranslate"><span class="pre">tune</span></code> from <code class="docutils literal notranslate"><span class="pre">ray</span></code>.</p>
<p><code class="docutils literal notranslate"><span class="pre">tune</span></code> performs distributed hyperparameter tuning with multi-GPU and multi-node support, utilizing all the hardware you have.</p>
<p>It supports the most popular ML libraries and integrates many other common hyperparameter optimization tools like Optuna or Hyperopt.</p>
<div class="cell docutils container">
<div class="cell_input docutils container">
<div class="highlight-ipython3 notranslate"><div class="highlight"><pre><span></span>!pip install &quot;ray[tune]&quot;
</pre></div>
</div>
</div>
</div>
<div class="cell docutils container">
<div class="cell_input docutils container">
<div class="highlight-ipython3 notranslate"><div class="highlight"><pre><span></span># !pip install &quot;ray[tune]&quot;
import sklearn.datasets
import sklearn.metrics
import sklearn.datasets
import sklearn.metrics
import xgboost as xgb
from ray import train, tune
from sklearn.model_selection import train_test_split


def train_breast_cancer(config):
data, labels = sklearn.datasets.load_breast_cancer(return_X_y=True)
train_x, test_x, train_y, test_y = train_test_split(data, labels, test_size=0.25)
train_set = xgb.DMatrix(train_x, label=train_y)
test_set = xgb.DMatrix(test_x, label=test_y)
results = {}
xgb.train(
config,
train_set,
evals=[(test_set, &quot;eval&quot;)],
evals_result=results,
verbose_eval=False,
)
accuracy = 1.0 - results[&quot;eval&quot;][&quot;error&quot;][-1]
train.report({&quot;mean_accuracy&quot;: accuracy, &quot;done&quot;: True})


config = {
&quot;objective&quot;: &quot;binary:logistic&quot;,
&quot;eval_metric&quot;: [&quot;logloss&quot;, &quot;error&quot;],
&quot;min_child_weight&quot;: tune.choice([1, 2, 3]),
&quot;subsample&quot;: tune.uniform(0.5, 1.0),
}

tuner = tune.Tuner(
train_breast_cancer,
tune_config=tune.TuneConfig(
num_samples=10,
),
param_space=config,
)
results = tuner.fit()
print(results.get_best_result(metric=&quot;mean_accuracy&quot;, mode=&quot;max&quot;).config)
</pre></div>
</div>
</div>
</div>
</section>
</section>

<script type="text/x-thebe-config">
@@ -1517,6 +1581,7 @@ <h2><span class="section-number">5.5.24. </span>Residual Plots with <code class=
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#visualize-high-performance-features-with-optuna">5.5.22. Visualize high-performance Features with <code class="docutils literal notranslate"><span class="pre">Optuna</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#model-ensembling-with-combo">5.5.23. Model Ensembling with <code class="docutils literal notranslate"><span class="pre">combo</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#residual-plots-with-yellowbrick">5.5.24. Residual Plots with <code class="docutils literal notranslate"><span class="pre">yellowbrick</span></code></a></li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#powerful-and-distributed-hyperparameter-optimization-with-ray-tune">5.5.25. Powerful and Distributed Hyperparameter Optimization with <code class="docutils literal notranslate"><span class="pre">ray.tune</span></code></a></li>
</ul>
</nav></div>

2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.
