From 241031630bc58c9a3292ffdf44aa728bd831d8ba Mon Sep 17 00:00:00 2001
From: Qingyun Wu
Date: Mon, 11 Nov 2024 15:59:49 -0800
Subject: [PATCH] update

---
 README.md | 50 ++++++++++++++++++++++++++------------------------
 1 file changed, 26 insertions(+), 24 deletions(-)

diff --git a/README.md b/README.md
index 21e04bb2d2..3d892deed4 100644
--- a/README.md
+++ b/README.md
@@ -21,20 +21,22 @@
 > :fire: :tada: Nov 11, 2024: We are evolving AutoGen into AG2! A new organization [ag2labs](https://github.com/ag2labs) is created to host the development of AG2 and related projects with open governance. We invite collaborators from all organizations and individuals to join the development.

-:fire: :tada: Sep 06, 2024: AG2 is available via `ag2` or `pyautogen` on PyPI! Starting with version 0.3.3, you can now install AG2 using:
+:fire: :tada: Sep 06, 2024: AG2 is available via `ag2` (or its alias `autogen` or `pyautogen`) on PyPI! Starting with version 0.3.3, you can now install AG2 using:
 ```
 pip install ag2
 ```
 or
-
 ```
 pip install pyautogen
 ```
+or
+```
+pip install autogen
+```

-**Note:** The previous package name `pyautogen` will remain valid for a transitional period. However, we encourage users to switch to the new, more intuitive `ag2` package name, as `pyautogen` will eventually be deprecated.
-📄 **License Change:**
-With this new release and package name, we are officially switching to the Apache 2.0 license. This enhances our commitment to open-source collaboration while providing additional protections for contributors and users alike.
+📄 **License:**
+We adopt the Apache 2.0 license from v0.3. This enhances our commitment to open-source collaboration while providing additional protections for contributors and users alike.

 :tada: May 29, 2024: DeepLearning.ai launched a new short course [AI Agentic Design Patterns with AutoGen](https://www.deeplearning.ai/short-courses/ai-agentic-design-patterns-with-autogen), made in collaboration with Microsoft and Penn State University, and taught by AutoGen creators [Chi Wang](https://github.com/sonichi) and [Qingyun Wu](https://github.com/qingyun-wu).
@@ -45,11 +47,11 @@ With this new release and package name, we are officially switching to the Apach
 :tada: May 11, 2024: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation](https://openreview.net/pdf?id=uAjxFFing2) received the best paper award at the [ICLR 2024 LLM Agents Workshop](https://llmagents.github.io/).

 :tada: Apr 17, 2024: Andrew Ng cited AutoGen in [The Batch newsletter](https://www.deeplearning.ai/the-batch/issue-245/) and [What's next for AI agentic workflows](https://youtu.be/sal78ACtGTc?si=JduUzN_1kDnMq0vF) at Sequoia Capital's AI Ascent (Mar 26).

-:tada: Mar 3, 2024: What's new in AutoGen? 📰[Blog](https://ag2labs.github.io/autogen/blog/2024/03/03/AutoGen-Update); 📺[Youtube](https://www.youtube.com/watch?v=j_mtwQiaLGU).
+:tada: Mar 3, 2024: What's new in AutoGen? 📰[Blog](https://ag2labs.github.io/ag2/blog/2024/03/03/AutoGen-Update); 📺[Youtube](https://www.youtube.com/watch?v=j_mtwQiaLGU).
@@ -57,9 +59,9 @@ With this new release and package name, we are officially switching to the Apach
 :tada: Dec 31, 2023: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155) is selected by [TheSequence: My Five Favorite AI Papers of 2023](https://thesequence.substack.com/p/my-five-favorite-ai-papers-of-2023).
 :tada: Nov 8, 2023: AutoGen is selected into [Open100: Top 100 Open Source achievements](https://www.benchcouncil.org/evaluation/opencs/annual.html) 35 days after spinoff from [FLAML](https://github.com/microsoft/FLAML).
@@ -76,7 +78,7 @@ With this new release and package name, we are officially switching to the Apach
@@ -120,17 +122,17 @@ The easiest way to start playing is

-## [Installation](https://ag2labs.github.io/autogen/docs/Installation)
+## [Installation](https://ag2labs.github.io/ag2/docs/Installation)

 ### Option 1. Install and Run AG2 in Docker

-Find detailed instructions for users [here](https://ag2labs.github.io/autogen/docs/installation/Docker#step-1-install-docker), and for developers [here](https://ag2labs.github.io/autogen/docs/Contribute#docker-for-development).
+Find detailed instructions for users [here](https://ag2labs.github.io/ag2/docs/installation/Docker#step-1-install-docker), and for developers [here](https://ag2labs.github.io/ag2/docs/Contribute#docker-for-development).

 ### Option 2. Install AG2 Locally

 AG2 requires **Python version >= 3.8, < 3.13**. It can be installed from pip:

 ```bash
-pip install autogen
+pip install ag2
 ```

 Minimal dependencies are installed without extra options. You can install extra options based on the feature you need.
@@ -140,13 +142,13 @@ Minimal dependencies are installed without extra options. You can install extra
 pip install "autogen[blendsearch]"
 ```
 -->
-Find more options in [Installation](https://ag2labs.github.io/autogen/docs/Installation#option-2-install-autogen-locally-using-virtual-environment).
+Find more options in [Installation](https://ag2labs.github.io/ag2/docs/Installation#option-2-install-autogen-locally-using-virtual-environment).

-Even if you are installing and running AG2 locally outside of docker, the recommendation and default behavior of agents is to perform [code execution](https://ag2labs.github.io/autogen/docs/FAQ/#code-execution) in docker. Find more instructions and how to change the default behaviour [here](https://ag2labs.github.io/autogen/docs/Installation#code-execution-with-docker-(default)).
+Even if you are installing and running AG2 locally outside of docker, the recommendation and default behavior of agents is to perform [code execution](https://ag2labs.github.io/ag2/docs/FAQ/#code-execution) in docker. Find more instructions and how to change the default behaviour [here](https://ag2labs.github.io/ag2/docs/Installation#code-execution-with-docker-(default)).

-For LLM inference configurations, check the [FAQs](https://ag2labs.github.io/autogen/docs/FAQ#set-your-api-endpoints).
+For LLM inference configurations, check the [FAQs](https://ag2labs.github.io/ag2/docs/FAQ#set-your-api-endpoints).
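The `OAI_CONFIG_LIST` referenced in that FAQ is simply a list of endpoint configurations; if you prefer not to use a file, the same information can be supplied directly in code. A minimal sketch (the model name and API key are placeholders):

```python
# Minimal in-code equivalent of an OAI_CONFIG_LIST entry; an OAI_CONFIG_LIST file
# holds the same list of dicts as JSON. Model name and api_key are placeholders.
config_list = [
    {"model": "gpt-4", "api_key": "<your-api-key>"},
]
```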

@@ -156,7 +158,7 @@ For LLM inference configurations, check the [FAQs](https://ag2labs.github.io/aut

 ## Multi-Agent Conversation Framework

-AG2 enables the next-gen LLM applications with a generic [multi-agent conversation](https://ag2labs.github.io/autogen/docs/Use-Cases/agent_chat) framework. It offers customizable and conversable agents that integrate LLMs, tools, and humans.
+AG2 enables the next-gen LLM applications with a generic [multi-agent conversation](https://ag2labs.github.io/ag2/docs/Use-Cases/agent_chat) framework. It offers customizable and conversable agents that integrate LLMs, tools, and humans.
 By automating chat among multiple capable agents, one can easily make them collectively perform tasks autonomously or with human feedback, including tasks that require using tools via code.

 Features of this use case include:
@@ -170,7 +172,7 @@ For [example](https://github.com/ag2labs/ag2/blob/main/test/twoagent.py),
 ```python
 from autogen import AssistantAgent, UserProxyAgent, config_list_from_json
 # Load LLM inference endpoints from an env variable or a file
-# See https://ag2labs.github.io/autogen/docs/FAQ#set-your-api-endpoints
+# See https://ag2labs.github.io/ag2/docs/FAQ#set-your-api-endpoints
 # and OAI_CONFIG_LIST_sample
 config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")
 # You can also set config_list directly as a list, for example, config_list = [{'model': 'gpt-4', 'api_key': ''},]
@@ -191,7 +193,7 @@ The figure below shows an example conversation flow with AG2.

 ![Agent Chat Example](https://github.com/ag2labs/ag2/blob/main/website/static/img/chat_example.png)

 Alternatively, the [sample code](https://github.com/ag2labs/build-with-autogen/blob/main/samples/simple_chat.py) here allows a user to chat with an AG2 agent in ChatGPT style.

-Please find more [code examples](https://ag2labs.github.io/autogen/docs/Examples#automated-multi-agent-chat) for this feature.
+Please find more [code examples](https://ag2labs.github.io/ag2/docs/Examples#automated-multi-agent-chat) for this feature.
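For context, the rest of that two-agent script falls outside the diff hunks shown above; it looks roughly like the sketch below. The agent names, `code_execution_config` settings, and task message are illustrative rather than taken from this patch.

```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json

# Load LLM inference endpoints from an env variable or a file, as above.
config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")

# An LLM-backed coding assistant.
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})

# A proxy for the human user that executes the code the assistant proposes.
user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Start the automated chat on a concrete task; the agents iterate until done.
user_proxy.initiate_chat(
    assistant, message="Plot a chart of NVDA and TESLA stock price change YTD."
)
```

Running it produces the kind of conversation flow shown in the figure: the assistant writes code, and the user proxy executes it and reports the result back.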

@@ -201,7 +203,7 @@ Please find more [code examples](https://ag2labs.github.io/autogen/docs/Examples

 ## Enhanced LLM Inferences

-AG2 also helps maximize the utility out of the expensive LLMs such as ChatGPT and GPT-4. It offers [enhanced LLM inference](https://ag2labs.github.io/autogen/docs/Use-Cases/enhanced_inference#api-unification) with powerful functionalities like caching, error handling, multi-config inference and templating.
+AG2 also helps maximize the utility out of the expensive LLMs such as ChatGPT and GPT-4. It offers [enhanced LLM inference](https://ag2labs.github.io/ag2/docs/Use-Cases/enhanced_inference#api-unification) with powerful functionalities like caching, error handling, multi-config inference and templating.

+Please find more [code examples](https://ag2labs.github.io/ag2/docs/Examples#tune-gpt-models) for this feature. -->
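As a rough illustration of those inference features, the sketch below uses `OpenAIWrapper` with the same `OAI_CONFIG_LIST` file assumed earlier; the `cache_seed` value and prompt are illustrative.

```python
from autogen import OpenAIWrapper, config_list_from_json

config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")

# Multi-config inference with error handling: the wrapper tries each endpoint
# configuration in order and falls back to the next one if a call fails.
client = OpenAIWrapper(config_list=config_list)

# Completions are cached on disk under the given cache_seed, so repeating the
# same call returns the cached result instead of hitting the API again.
response = client.create(
    messages=[{"role": "user", "content": "Summarize multi-agent LLM workflows in one sentence."}],
    cache_seed=42,
)
print(client.extract_text_or_completion_object(response)[0])
```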

@@ -230,15 +232,15 @@ Please find more [code examples](https://ag2labs.github.io/autogen/docs/Examples

 ## Documentation

-You can find detailed documentation about AG2 [here](https://ag2labs.github.io/autogen/).
+You can find detailed documentation about AG2 [here](https://ag2labs.github.io/ag2/).

 In addition, you can find:

-- [Research](https://ag2labs.github.io/autogen/docs/Research), [blogposts](https://ag2labs.github.io/autogen/blog) around AG2, and [Transparency FAQs](https://github.com/ag2labs/ag2/blob/main/TRANSPARENCY_FAQS.md)
+- [Research](https://ag2labs.github.io/ag2/docs/Research), [blogposts](https://ag2labs.github.io/ag2/blog) around AG2, and [Transparency FAQs](https://github.com/ag2labs/ag2/blob/main/TRANSPARENCY_FAQS.md)

 - [Discord](https://discord.gg/pAbnFJrkgZ)

-- [Contributing guide](https://ag2labs.github.io/autogen/docs/Contribute)
+- [Contributing guide](https://ag2labs.github.io/ag2/docs/Contribute)