Commit: merge main
jmoralez committed Dec 5, 2024
2 parents 9c27381 + 9997f8b commit 1aa71c8
Showing 4 changed files with 450 additions and 192 deletions.
18 changes: 2 additions & 16 deletions nbs/docs/getting-started/7_why_timegpt.ipynb
@@ -14,21 +14,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Users/yibeihu/opt/anaconda3/envs/report/lib/python3.9/site-packages/dask/dataframe/__init__.py:42: FutureWarning: \n",
"Dask dataframe query planning is disabled because dask-expr is not installed.\n",
"\n",
"You can install it with `pip install dask[dataframe]` or `conda install dask`.\n",
"This will raise in a future version.\n",
"\n",
" warnings.warn(msg, FutureWarning)\n"
]
}
],
"outputs": [],
"source": [
"#| hide \n",
"from nixtla.utils import in_colab"
@@ -775,7 +761,7 @@
"#### Benchmark Results\n",
"For a more comprehensive dive into model accuracy and performance, explore our [Time Series Model Arena](https://github.com/Nixtla/nixtla/tree/main/experiments/foundation-time-series-arena)! TimeGPT continues to lead the pack with exceptional performance across benchmarks! 🌟\n",
"\n",
"![benchmark results](../../assets/timeseries_model_arena.png)"
"<img src=\"../../assets/timeseries_model_arena.png\" />"
]
},
{
180 changes: 5 additions & 175 deletions nbs/docs/tutorials/06_finetuning.ipynb
@@ -319,186 +319,14 @@
")"
]
},
- {
- "cell_type": "markdown",
- "id": "cf9e168a",
- "metadata": {},
- "source": [
- "### 3.1 Control the level of fine-tuning with `finetune_depth`\n",
- "It is also possible to control the depth of fine-tuning with the `finetune_depth` parameter.\n",
- "\n",
- "`finetune_depth` takes values among `[1, 2, 3, 4, 5]`. By default, it is set to 1, which means that a small set of the model's parameters are being adjusted, whereas a value of 5 fine-tunes the maximum amount of parameters. Increasing `finetune_depth` also increases the time to generate predictions."
- ]
- },
- {
- "cell_type": "markdown",
- "id": "446f47ef",
- "metadata": {},
- "source": [
- "Let's run a small experiment to see how `finetune_depth` impacts the performance."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "6a55e52f",
- "metadata": {},
- "outputs": [],
- "source": [
- "train = df[:-24]\n",
- "test = df[-24:]"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "f71b895c",
- "metadata": {},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "INFO:nixtla.nixtla_client:Validating inputs...\n",
- "INFO:nixtla.nixtla_client:Inferred freq: MS\n",
- "WARNING:nixtla.nixtla_client:The specified horizon \"h\" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.\n",
- "INFO:nixtla.nixtla_client:Preprocessing dataframes...\n",
- "INFO:nixtla.nixtla_client:Calling Forecast Endpoint...\n",
- "INFO:nixtla.nixtla_client:Validating inputs...\n",
- "INFO:nixtla.nixtla_client:Inferred freq: MS\n",
- "WARNING:nixtla.nixtla_client:The specified horizon \"h\" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.\n",
- "INFO:nixtla.nixtla_client:Preprocessing dataframes...\n",
- "INFO:nixtla.nixtla_client:Calling Forecast Endpoint...\n",
- "INFO:nixtla.nixtla_client:Validating inputs...\n",
- "INFO:nixtla.nixtla_client:Inferred freq: MS\n",
- "WARNING:nixtla.nixtla_client:The specified horizon \"h\" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.\n",
- "INFO:nixtla.nixtla_client:Preprocessing dataframes...\n",
- "INFO:nixtla.nixtla_client:Calling Forecast Endpoint...\n",
- "INFO:nixtla.nixtla_client:Validating inputs...\n",
- "INFO:nixtla.nixtla_client:Inferred freq: MS\n",
- "WARNING:nixtla.nixtla_client:The specified horizon \"h\" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.\n",
- "INFO:nixtla.nixtla_client:Preprocessing dataframes...\n",
- "INFO:nixtla.nixtla_client:Calling Forecast Endpoint...\n",
- "INFO:nixtla.nixtla_client:Validating inputs...\n",
- "INFO:nixtla.nixtla_client:Inferred freq: MS\n",
- "WARNING:nixtla.nixtla_client:The specified horizon \"h\" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.\n",
- "INFO:nixtla.nixtla_client:Preprocessing dataframes...\n",
- "INFO:nixtla.nixtla_client:Calling Forecast Endpoint...\n"
- ]
- }
- ],
- "source": [
- "depths = [1, 2, 3, 4, 5]\n",
- "\n",
- "test = test.copy()\n",
- "\n",
- "for depth in depths:\n",
- " preds_df = nixtla_client.forecast(\n",
- " df=train, \n",
- " h=24, \n",
- " finetune_steps=5,\n",
- " finetune_depth=depth,\n",
- " time_col='timestamp', \n",
- " target_col='value')\n",
- "\n",
- " preds = preds_df['TimeGPT'].values\n",
- "\n",
- " test.loc[:,f'TimeGPT_depth{depth}'] = preds"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "019b17f6",
- "metadata": {},
- "outputs": [
- {
- "data": {
- "text/html": [
- "<div>\n",
- "<style scoped>\n",
- " .dataframe tbody tr th:only-of-type {\n",
- " vertical-align: middle;\n",
- " }\n",
- "\n",
- " .dataframe tbody tr th {\n",
- " vertical-align: top;\n",
- " }\n",
- "\n",
- " .dataframe thead th {\n",
- " text-align: right;\n",
- " }\n",
- "</style>\n",
- "<table border=\"1\" class=\"dataframe\">\n",
- " <thead>\n",
- " <tr style=\"text-align: right;\">\n",
- " <th></th>\n",
- " <th>unique_id</th>\n",
- " <th>metric</th>\n",
- " <th>TimeGPT_depth1</th>\n",
- " <th>TimeGPT_depth2</th>\n",
- " <th>TimeGPT_depth3</th>\n",
- " <th>TimeGPT_depth4</th>\n",
- " <th>TimeGPT_depth5</th>\n",
- " </tr>\n",
- " </thead>\n",
- " <tbody>\n",
- " <tr>\n",
- " <th>0</th>\n",
- " <td>0</td>\n",
- " <td>mae</td>\n",
- " <td>22.805146</td>\n",
- " <td>17.929682</td>\n",
- " <td>21.320125</td>\n",
- " <td>24.944233</td>\n",
- " <td>28.735563</td>\n",
- " </tr>\n",
- " <tr>\n",
- " <th>1</th>\n",
- " <td>0</td>\n",
- " <td>mse</td>\n",
- " <td>683.303778</td>\n",
- " <td>462.133945</td>\n",
- " <td>678.182747</td>\n",
- " <td>1003.023709</td>\n",
- " <td>1119.906759</td>\n",
- " </tr>\n",
- " </tbody>\n",
- "</table>\n",
- "</div>"
- ],
- "text/plain": [
- " unique_id metric TimeGPT_depth1 TimeGPT_depth2 TimeGPT_depth3 \\\n",
- "0 0 mae 22.805146 17.929682 21.320125 \n",
- "1 0 mse 683.303778 462.133945 678.182747 \n",
- "\n",
- " TimeGPT_depth4 TimeGPT_depth5 \n",
- "0 24.944233 28.735563 \n",
- "1 1003.023709 1119.906759 "
- ]
- },
- "execution_count": null,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
- "source": [
- "test['unique_id'] = 0\n",
- "\n",
- "evaluation = evaluate(test, metrics=[mae, mse], time_col=\"timestamp\", target_col=\"value\")\n",
- "evaluation"
- ]
- },
{
"cell_type": "markdown",
"id": "62fc9cba-7c6e-4aef-9c68-e05d4fe8f7ba",
"metadata": {},
"source": [
"As you can see, increasing the depth of fine-tuning can improve the performance of the model, but it can make it worse too due to overfitting. \n",
"\n",
"Thus, keep in mind that fine-tuning can be a bit of trial and error. You might need to adjust the number of `finetune_steps` and the level of `finetune_depth` based on your specific needs and the complexity of your data. Usually, a higher `finetune_depth` works better for large datasets. In this specific tutorial, since we were forecasting a single series with a very short dataset, increasing the depth led to overfitting.\n",
"Keep in mind that fine-tuning can be a bit of trial and error. You might need to adjust the number of `finetune_steps` based on your specific needs and the complexity of your data. Usually, a larger value of `finetune_steps` works better for large datasets.\n",
"\n",
"It's recommended to monitor the model's performance during fine-tuning and adjust as needed. Be aware that more `finetune_steps` and a larger value of `finetune_depth` may lead to longer training times and could potentially lead to overfitting if not managed properly. \n",
"It's recommended to monitor the model's performance during fine-tuning and adjust as needed. Be aware that more `finetune_steps` may lead to longer training times and could potentially lead to overfitting if not managed properly. \n",
"\n",
"Remember, fine-tuning is a powerful feature, but it should be used thoughtfully and carefully."
]
@@ -508,7 +336,9 @@
"id": "8c546351",
"metadata": {},
"source": [
"For a detailed guide on using a specific loss function for fine-tuning, check out the [Fine-tuning with a specific loss function](https://docs.nixtla.io/docs/tutorials-fine_tuning_with_a_specific_loss_function) tutorial."
"For a detailed guide on using a specific loss function for fine-tuning, check out the [Fine-tuning with a specific loss function](https://docs.nixtla.io/docs/tutorials-fine_tuning_with_a_specific_loss_function) tutorial.\n",
"\n",
"Read also our detailed tutorial on [controlling the level of fine-tuning](https://docs.nixtla.io/docs/tutorials-finetune_depth_finetuning) using `finetune_depth`."
]
}
],
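Taken together, the cells removed from 06_finetuning.ipynb above formed a small `finetune_depth` experiment. The sketch below reconstructs it end to end; the dataset URL, the API-key placeholder, and the `utilsforecast` imports are assumptions based on the tutorial's usual setup and are not part of this commit:

```python
# Minimal sketch of the finetune_depth experiment shown in the diff above.
# Assumptions (not from this commit): dataset URL, API key, utilsforecast imports.
import pandas as pd
from nixtla import NixtlaClient
from utilsforecast.evaluation import evaluate
from utilsforecast.losses import mae, mse

nixtla_client = NixtlaClient(api_key="YOUR_API_KEY")  # hypothetical key

df = pd.read_csv(
    "https://raw.githubusercontent.com/Nixtla/transfer-learning-time-series/"
    "main/datasets/air_passengers.csv"
)
train, test = df[:-24], df[-24:].copy()

# finetune_depth takes values in [1, 2, 3, 4, 5]; higher values adjust more
# of the model's parameters and increase the time to generate predictions.
for depth in [1, 2, 3, 4, 5]:
    preds_df = nixtla_client.forecast(
        df=train,
        h=24,
        finetune_steps=5,
        finetune_depth=depth,
        time_col="timestamp",
        target_col="value",
    )
    test.loc[:, f"TimeGPT_depth{depth}"] = preds_df["TimeGPT"].values

# evaluate() expects an id column; the tutorial uses a constant for its single series.
test["unique_id"] = 0
print(evaluate(test, metrics=[mae, mse], time_col="timestamp", target_col="value"))
```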