diff --git a/notebooks/en/multiagent_web_assistant.ipynb b/notebooks/en/multiagent_web_assistant.ipynb
index 1a9a7b52..84d62e28 100644
--- a/notebooks/en/multiagent_web_assistant.ipynb
+++ b/notebooks/en/multiagent_web_assistant.ipynb
@@ -32,8 +32,6 @@
     "```\n",
     "Let's set up this system. \n",
     "\n",
-    "⚡️ Our agent will be powered by [meta-llama/Meta-Llama-3.1-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct) using `HfApiEngine` class that uses HF's Inference API: the Inference API allows to quickly and easily run any OS model.\n",
-    "\n",
     "Run the line below to install the required dependencies:"
    ]
   },
@@ -50,7 +48,27 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We will choose to have our model powered by [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) since it's very powerful and available for free in the HF API."
+    "Let's log in to be able to call the HF Inference API:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from huggingface_hub import notebook_login\n",
+    "\n",
+    "notebook_login()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "⚡️ Our agent will be powered by [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) using the `HfApiEngine` class, which relies on HF's Inference API: the Inference API lets you quickly and easily run any open-source model.\n",
+    "\n",
+    "_Note:_ The Inference API hosts models based on various criteria, and deployed models may be updated or replaced without prior notice. Learn more about it [here](https://huggingface.co/docs/api-inference/supported-models)."
    ]
   },
  {