[FIX] Img links #558

Merged · 4 commits · Dec 6, 2024
2 changes: 1 addition & 1 deletion nbs/docs/getting-started/1_introduction.ipynb
@@ -52,7 +52,7 @@
"\n",
"Self-attention, the revolutionary concept introduced by the paper [Attention is all you need](https://arxiv.org/abs/1706.03762), is the basis of this foundation model. TimeGPT model is not based on any existing large language model(LLM). Instead, it is independently trained on a vast amount of time series data, and the large transformer model is designed to minimize the forecasting error.\n",
"\n",
"<img src=\"../../img/timegpt_archi.png\" />\n",
"<img src=\"https://github.com/Nixtla/nixtla/blob/main/nbs/img/timegpt_archi.png?raw=true\" />\n",
"\n",
"The architecture consists of an encoder-decoder structure with multiple layers, each with residual connections and layer normalization. Finally, a linear layer maps the decoder’s output to the forecasting window dimension. The general intuition is that attention-based mechanisms are able to capture the diversity of past events and correctly extrapolate potential future distributions."
]
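The cell changed above describes the architecture in prose: an encoder-decoder transformer with residual connections and layer normalization in every layer, closed by a linear layer that maps the decoder output to the forecasting window. As a rough illustration of that shape only (TimeGPT itself is not open source, so every dimension, name, and design choice below is an assumption, not Nixtla's implementation):

```python
# Minimal sketch of the encoder-decoder shape described above.
# NOT the actual TimeGPT model: all sizes and choices are illustrative.
import torch
import torch.nn as nn

class EncoderDecoderForecaster(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, horizon=12):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)  # embed scalar observations
        # nn.Transformer already applies residual connections and layer
        # normalization inside every encoder and decoder layer.
        self.transformer = nn.Transformer(
            d_model=d_model,
            nhead=nhead,
            num_encoder_layers=num_layers,
            num_decoder_layers=num_layers,
            batch_first=True,
        )
        # Final linear layer: map decoder output back to the series scale.
        self.head = nn.Linear(d_model, 1)
        self.horizon = horizon

    def forward(self, history):  # history: (batch, seq_len, 1)
        src = self.input_proj(history)
        # Repeat the last embedded observation as a naive decoder query;
        # a real model would use learned or autoregressive queries.
        tgt = src[:, -1:, :].repeat(1, self.horizon, 1)
        out = self.transformer(src, tgt)  # (batch, horizon, d_model)
        return self.head(out)             # (batch, horizon, 1)

model = EncoderDecoderForecaster()
forecast = model(torch.randn(8, 48, 1))  # 12-step forecast from 48 points
```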
2 changes: 1 addition & 1 deletion nbs/docs/getting-started/7_why_timegpt.ipynb
@@ -761,7 +761,7 @@
"#### Benchmark Results\n",
"For a more comprehensive dive into model accuracy and performance, explore our [Time Series Model Arena](https://github.com/Nixtla/nixtla/tree/main/experiments/foundation-time-series-arena)! TimeGPT continues to lead the pack with exceptional performance across benchmarks! 🌟\n",
"\n",
"<img src=\"../../assets/timeseries_model_arena.png\" />"
"<img src=\"https://github.com/Nixtla/nixtla/blob/main/nbs/img/timeseries_model_arena.png?raw=true\" alt=\"Benchmark\" />"
]
},
{
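Both hunks in this PR apply the same fix: replace a relative `../../img/...` or `../../assets/...` path with an absolute `?raw=true` GitHub URL, so the images resolve regardless of where the rendered notebook is served from. Purely as a hypothetical sketch of automating that rewrite across the notebooks (this script is not part of the PR; the glob pattern, regex, and URL prefix are assumptions):

```python
# Hypothetical helper, not part of this PR: rewrite relative image paths
# in notebook markdown cells to absolute raw GitHub URLs.
import json
import re
from pathlib import Path

RAW_PREFIX = "https://github.com/Nixtla/nixtla/blob/main/nbs/img/"
IMG_SRC = re.compile(r'src="(?:\.\./)+(?:img|assets)/([^"]+)"')

def fix_img_links(notebook_path: Path) -> None:
    nb = json.loads(notebook_path.read_text(encoding="utf-8"))
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "markdown":
            cell["source"] = [
                IMG_SRC.sub(rf'src="{RAW_PREFIX}\1?raw=true"', line)
                for line in cell["source"]
            ]
    notebook_path.write_text(json.dumps(nb, indent=1), encoding="utf-8")

for nb_file in Path("nbs/docs").rglob("*.ipynb"):
    fix_img_links(nb_file)
```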