add chainlit sample (microsoft#4749)
* add chainlit sample
* readme updates
* put team inside run to avoid problems
* simplify example and add readme
* inform user team is reset

---------

Co-authored-by: Hussein Mozannar <[email protected]>
1 parent e902e94, commit c215552. Showing 5 changed files with 217 additions and 0 deletions.
`.gitignore` (new file)
@@ -0,0 +1,63 @@

build
dist

*.egg-info

.env

*.files

venv
.venv
.DS_Store

.chainlit
!cypress/e2e/**/*/.chainlit/*
chainlit.md

cypress/screenshots
cypress/videos
cypress/downloads

__pycache__

.ipynb_checkpoints

*.db

.mypy_cache

chat_files

.chroma

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*

node_modules
dist
dist-ssr
*.local

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?

.aider*
.coverage

backend/README.md
backend/.dmypy.json

`README.md` (new file)
@@ -0,0 +1,108 @@

# Building a Multi-Agent Application with AutoGen and Chainlit

In this sample, we will build a simple chat interface that interacts with a `RoundRobinGroupChat` team built using the [AutoGen AgentChat](https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/index.html) API.

![AgentChat](docs/chainlit_autogen.png)

## High-Level Description

The `app.py` script sets up a Chainlit chat interface that communicates with the AutoGen team. When a chat starts, it:

- Initializes an AgentChat team.

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination, MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def get_weather(city: str) -> str:
    return f"The weather in {city} is 73 degrees and Sunny."


assistant_agent = AssistantAgent(
    name="assistant_agent",
    tools=[get_weather],
    model_client=OpenAIChatCompletionClient(
        model="gpt-4o-2024-08-06"))

# Stop when an agent says "TERMINATE" or after 10 messages, whichever comes first.
termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(10)
team = RoundRobinGroupChat(
    participants=[assistant_agent], termination_condition=termination)
```

- As users interact with the chat, their queries are sent to the team, which responds.
- As agents respond/act, their responses are streamed back to the chat interface (see the condensed sketch below).
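
A condensed version of that streaming loop, as implemented in `run_team` in `app.py` later in this sample (the termination notice and error handling are omitted here):

```python
# Condensed from app.py: forward each streamed team message to the Chainlit UI.
response_stream = team.run_stream(task=query)
async for msg in response_stream:
    if hasattr(msg, "content"):
        await cl.Message(content=msg.content, author="Agent Team").send()
```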

## Quickstart

To get started, ensure you have set up an API key. We will be using the OpenAI API for this example.

1. Ensure you have an OpenAI API key. Set this key in your environment variables as `OPENAI_API_KEY` (an optional sanity check is sketched after this list).

2. Install the required Python packages by running:

```shell
pip install -r requirements.txt
```

3. Run the `app.py` script to start the Chainlit server:

```shell
chainlit run app.py -h
```

4. Interact with the Agent Team in the Chainlit interface. The chat interface will be available at `http://localhost:8000` by default.
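
If you want the app to fail fast when the key is missing, a small guard like the following could be added near the top of `app.py`. This check is not part of the sample; it is shown only as an optional sketch:

```python
import os

# Hypothetical guard (not in the sample): stop early if the OpenAI key is not configured.
if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; export it before running `chainlit run app.py`.")
```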

### Function Definitions

- `start_chat`: Initializes the chat session.
- `run_team`: Sends the user's query to the team and streams the agent responses back to the chat interface.
- `chat`: Receives messages from the user and passes them to the `run_team` function.
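
For orientation, this is the skeleton of how those three functions are wired to Chainlit in `app.py` (bodies elided):

```python
import chainlit as cl


@cl.on_chat_start
async def start_chat():
    # Runs once when a new chat session opens.
    ...


async def run_team(query: str):
    # Builds the team, runs it on the query, and streams responses back.
    ...


@cl.on_message
async def chat(message: cl.Message):
    # Called for every user message; delegates to run_team.
    await run_team(message.content)
```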

## Adding a UserProxyAgent

We can add a `UserProxyAgent` to the team so that the user can interact with the team directly through the input box in the chat interface. This requires defining an input function that uses the Chainlit input box instead of the terminal.

```python
from typing import Optional

import chainlit as cl  # needed for cl.AskUserMessage below
from autogen_core import CancellationToken
from autogen_agentchat.agents import AssistantAgent, UserProxyAgent
from autogen_agentchat.conditions import TextMentionTermination, MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def chainlit_input_func(prompt: str, cancellation_token: Optional[CancellationToken] = None) -> str:
    try:
        # Ask the user through the Chainlit input box instead of the terminal.
        response = await cl.AskUserMessage(
            content=prompt,
            author="System",
        ).send()
        return response["output"]
    except Exception as e:
        raise RuntimeError(f"Failed to get user input: {str(e)}") from e


user_proxy_agent = UserProxyAgent(
    name="user_proxy_agent",
    input_func=chainlit_input_func,
)
assistant_agent = AssistantAgent(
    name="assistant_agent",
    model_client=OpenAIChatCompletionClient(
        model="gpt-4o-2024-08-06"))

termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(10)

team = RoundRobinGroupChat(
    participants=[user_proxy_agent, assistant_agent],
    termination_condition=termination)
```
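
This team can be streamed from Chainlit the same way as the single-agent one. A minimal sketch is below; note that in the sample itself the team is rebuilt inside `run_team` on each call, while here it is assumed to be the `team` built above:

```python
async def run_team(query: str) -> None:
    # Stream the user-in-the-loop team; the UserProxyAgent prompts via the Chainlit input box.
    async for msg in team.run_stream(task=query):
        if hasattr(msg, "content"):
            await cl.Message(content=msg.content, author="Agent Team").send()
```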

## Next Steps (Extra Credit)

In this example, we created a basic AutoGen team with a single agent in a `RoundRobinGroupChat` team. There are a few ways you can extend this example:

- Add more [agents](https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/tutorial/agents.html) to the team (see the sketch after this list).
- Explore custom agents that send multimodal messages.
- Explore more [team](https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/tutorial/teams.html) types beyond the `RoundRobinGroupChat`.
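
As an illustration of the first bullet, the sketch below adds a hypothetical second assistant that critiques the first agent's answers. The `critic_agent` name and its system message are invented for this example and are not part of the sample code:

```python
# Hypothetical extension (not part of the sample): a two-agent round-robin team.
critic_agent = AssistantAgent(
    name="critic_agent",
    system_message="Review the previous answer and reply with TERMINATE when it is satisfactory.",
    model_client=OpenAIChatCompletionClient(model="gpt-4o-2024-08-06"),
)

termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(10)
team = RoundRobinGroupChat(
    participants=[assistant_agent, critic_agent],
    termination_condition=termination,
)
```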

`app.py` (new file)
@@ -0,0 +1,41 @@
import chainlit as cl
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination, MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_agentchat.base import TaskResult


async def get_weather(city: str) -> str:
    return f"The weather in {city} is 73 degrees and Sunny."


@cl.on_chat_start
async def start_chat():
    cl.user_session.set(
        "prompt_history",
        "",
    )


async def run_team(query: str):
    # The team is created inside each run so every query starts from a fresh state.
    assistant_agent = AssistantAgent(
        name="assistant_agent", tools=[get_weather], model_client=OpenAIChatCompletionClient(model="gpt-4o-2024-08-06")
    )

    termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(10)
    team = RoundRobinGroupChat(participants=[assistant_agent], termination_condition=termination)

    # Stream the team's messages back to the Chainlit UI as they are produced.
    response_stream = team.run_stream(task=query)
    async for msg in response_stream:
        if hasattr(msg, "content"):
            msg = cl.Message(content=msg.content, author="Agent Team")
            await msg.send()
        if isinstance(msg, TaskResult):
            # The stream ends with a TaskResult; tell the user the team has been reset.
            msg = cl.Message(content="Termination condition met. Team and Agents are reset.", author="Agent Team")
            await msg.send()


@cl.on_message
async def chat(message: cl.Message):
    await run_team(message.content)

`docs/chainlit_autogen.png` (new binary file; the image referenced in the README above, not rendered here)

`requirements.txt` (new file)
@@ -0,0 +1,2 @@
chainlit
autogen-agentchat==0.4.0.dev11