V0.3.0doc (#33)
* 0.3.0

* remove badge

* pyautogen doc to autogen doc

* license

* wording

* wording

* add link to the title

* wording
qingyun-wu authored Sep 5, 2024
1 parent cfae36c commit 6ef974a
Showing 19 changed files with 52 additions and 54 deletions.
2 changes: 1 addition & 1 deletion .devcontainer/README.md
@@ -40,7 +40,7 @@ Feel free to modify these Dockerfiles for your specific project needs. Here are
- **Setting Environment Variables**: Add environment variables using the `ENV` command for any application-specific configurations. We have prestaged the line needed to inject your OpenAI key into the Docker environment as an environment variable. Others can be staged in the same way. Just uncomment the line.
`# ENV OPENAI_API_KEY="{OpenAI-API-Key}"` to `ENV OPENAI_API_KEY="{OpenAI-API-Key}"`
- **Need a less "Advanced" AutoGen build**: If the `./full/Dockerfile` is too much but you still need more than the base build, update this line in the Dockerfile.
-`RUN pip install pyautogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra` to install just what you need. `RUN pip install pyautogen[retrievechat,blendsearch] autogenra`
+`RUN pip install autogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra` to install just what you need. `RUN pip install autogen[retrievechat,blendsearch] autogenra`
- **Can't Dev without your favorite CLI tool**: If you need particular OS tools installed in your Docker container, add those packages right after the `sudo` line in the `./base/Dockerfile` and `./full/Dockerfile` files. In the example below we install net-tools and vim.

2 changes: 1 addition & 1 deletion .devcontainer/full/Dockerfile
@@ -22,7 +22,7 @@ WORKDIR /home/autogen-ai

# Install Python packages
RUN pip install --upgrade pip
-RUN pip install pyautogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra
+RUN pip install autogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra
RUN pip install numpy pandas matplotlib seaborn scikit-learn requests urllib3 nltk pillow pytest beautifulsoup4

# Expose port
4 changes: 1 addition & 3 deletions autogen/agentchat/contrib/capabilities/text_compressors.py
@@ -10,9 +10,7 @@
try:
    import llmlingua
except ImportError:
-    IMPORT_ERROR = ImportError(
-        "LLMLingua is not installed. Please install it with `pip install pyautogen[long-context]`"
-    )
+    IMPORT_ERROR = ImportError("LLMLingua is not installed. Please install it with `pip install autogen[long-context]`")
    PromptCompressor = object
else:
    from llmlingua import PromptCompressor
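The hunk above collapses a deferred-`ImportError` guard into one line. The underlying pattern — import an optional dependency if present, otherwise stash an informative error to raise at first use — can be sketched generically as follows (the helper name and module names here are illustrative, not AutoGen API):

```python
import importlib


def optional_import(module_name: str, extra: str):
    """Return (module, None) if importable, else (None, ImportError) to raise later."""
    try:
        return importlib.import_module(module_name), None
    except ImportError:
        hint = ImportError(
            f"{module_name} is not installed. "
            f"Please install it with `pip install autogen[{extra}]`"
        )
        return None, hint


# Present dependency: the module comes back and no error is stashed.
json_mod, json_err = optional_import("json", "long-context")
# Missing dependency: the import of this file still succeeds;
# the stashed error waits until the feature is actually used.
missing_mod, missing_err = optional_import("definitely_not_installed_xyz", "long-context")
```

Deferring the error this way lets the rest of the package import cleanly even when an optional extra is absent, which is why the original code also assigns a placeholder (`PromptCompressor = object`).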
2 changes: 1 addition & 1 deletion autogen/agentchat/contrib/retrieve_user_proxy_agent.py
@@ -15,7 +15,7 @@
try:
    import chromadb
except ImportError as e:
-    raise ImportError(f"{e}. You can try `pip install pyautogen[retrievechat]`, or install `chromadb` manually.")
+    raise ImportError(f"{e}. You can try `pip install autogen[retrievechat]`, or install `chromadb` manually.")
from autogen.agentchat import UserProxyAgent
from autogen.agentchat.agent import Agent
from autogen.agentchat.contrib.vectordb.base import Document, QueryResults, VectorDB, VectorDBFactory
6 changes: 3 additions & 3 deletions autogen/oai/completion.py
@@ -580,7 +580,7 @@ def eval_func(responses, **data):
tune.ExperimentAnalysis: The tuning results.
"""
logger.warning(
-    "tuning via Completion.tune is deprecated in pyautogen v0.2 and openai>=1. "
+    "tuning via Completion.tune is deprecated in autogen, pyautogen v0.2 and openai>=1. "
"flaml.tune supports tuning more generically."
)
if ERROR:
@@ -792,7 +792,7 @@ def yes_or_no_filter(context, config, response):
- `pass_filter`: whether the response passes the filter function. None if no filter is provided.
"""
logger.warning(
-    "Completion.create is deprecated in pyautogen v0.2 and openai>=1. "
+    "Completion.create is deprecated in autogen, pyautogen v0.2 and openai>=1. "
"The new openai requires initiating a client for inference. "
"Please refer to https://autogen-ai.github.io/autogen/docs/Use-Cases/enhanced_inference#api-unification"
)
@@ -1183,7 +1183,7 @@ def start_logging(
reset_counter (bool): whether to reset the counter of the number of API calls.
"""
logger.warning(
-    "logging via Completion.start_logging is deprecated in pyautogen v0.2. "
+    "logging via Completion.start_logging is deprecated in autogen and pyautogen v0.2. "
"logging via OpenAIWrapper will be added back in a future release."
)
if ERROR:
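The three hunks above only reword deprecation messages emitted through `logger.warning`. As a generic illustration of the same shim pattern — a deprecated entry point that warns and returns a placeholder — here is a stdlib-only sketch (the function name and return shape are hypothetical, not the actual `Completion.create`):

```python
import warnings


def completion_create(**kwargs):
    """Hypothetical stand-in for a deprecated entry point like Completion.create."""
    warnings.warn(
        "Completion.create is deprecated in autogen, pyautogen v0.2 and openai>=1. "
        "The new openai requires initiating a client for inference.",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not this shim
    )
    return {"choices": []}  # placeholder response


# Capture the warning to show it fires exactly once per call.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    completion_create(prompt="hello")
```

Using `stacklevel=2` makes the warning report the caller's file and line, which is the conventional way to steer users of a deprecated API toward their own call site.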
6 changes: 3 additions & 3 deletions notebook/contributing.md
@@ -36,9 +36,9 @@ You don't need to explain in depth how to install AutoGen. Unless there are spec
``````
````{=mdx}
:::info Requirements
-Install `pyautogen`:
+Install `autogen`:
```bash
-pip install pyautogen
+pip install autogen
```
For more information, please refer to the [installation guide](/docs/installation/).
@@ -54,7 +54,7 @@ Or if extras are needed:
Some extra dependencies are needed for this notebook, which can be installed via pip:
```bash
-pip install pyautogen[retrievechat] flaml[automl]
+pip install autogen[retrievechat] flaml[automl]
```
For more information, please refer to the [installation guide](/docs/installation/).
4 changes: 2 additions & 2 deletions website/blog/2023-10-18-RetrieveChat/index.mdx
@@ -52,9 +52,9 @@ The conversation terminates if no more documents are available for the context.
## Basic Usage of RAG Agents
0. Install dependencies

-Please install pyautogen with the [retrievechat] option before using RAG agents.
+Please install autogen with the [retrievechat] option before using RAG agents.
```bash
-pip install "pyautogen[retrievechat]"
+pip install "autogen[retrievechat]"
```

RetrieveChat can handle various types of documents. By default, it can process
4 changes: 2 additions & 2 deletions website/blog/2023-10-26-TeachableAgent/index.mdx
@@ -36,10 +36,10 @@ AutoGen contains four code examples that use `Teachability`.

1. Install dependencies

-Please install pyautogen with the [teachable] option before using `Teachability`.
+Please install autogen with the [teachable] option before using `Teachability`.

```bash
-pip install "pyautogen[teachable]"
+pip install "autogen[teachable]"
```

2. Import agents
2 changes: 1 addition & 1 deletion website/blog/2023-11-06-LMM-Agent/index.mdx
@@ -25,7 +25,7 @@ GPT-4V represents the forefront in image comprehension, while LLaVA is an effici
Incorporate the `lmm` feature during AutoGen installation:

```bash
-pip install "pyautogen[lmm]"
+pip install "autogen[lmm]"
```

Subsequently, import the **Multimodal Conversable Agent** or **LLaVA Agent** from AutoGen:
2 changes: 1 addition & 1 deletion website/blog/2023-11-13-OAI-assistants/index.mdx
@@ -30,7 +30,7 @@ This integration shows great potential and synergy, and we plan to continue enha
## Installation

```bash
-pip install pyautogen==0.2.0b5
+pip install autogen
```

## Basic Example
2 changes: 1 addition & 1 deletion website/blog/2023-11-26-Agent-AutoBuild/index.mdx
@@ -29,7 +29,7 @@ up an endpoint server automatically without any user participation.
## Installation
- AutoGen:
```bash
-pip install pyautogen[autobuild]
+pip install autogen[autobuild]
```
- (Optional: if you want to use open-source LLMs) vLLM and FastChat
2 changes: 1 addition & 1 deletion website/blog/2024-03-03-AutoGen-Update/index.mdx
@@ -148,7 +148,7 @@ These tools have been used for improving the AutoGen library as well as applicat
We are making rapid progress in further improving the interface to make it even easier to build agent applications. For example:

- [AutoBuild](/blog/2023/11/26/Agent-AutoBuild). AutoBuild is an ongoing area of research to automatically create or select a group of agents for a given task and objective. If successful, it will greatly reduce the effort from users or developers when using the multi-agent technology. It also paves the way for agentic decomposition to handle complex tasks. It is available as an experimental feature and demonstrated in two modes: free-form [creation](https://github.com/microsoft/autogen/blob/main/notebook/autobuild_basic.ipynb) and [selection](https://github.com/microsoft/autogen/blob/main/notebook/autobuild_agent_library.ipynb) from a library.
-- [AutoGen Studio](/blog/2023/12/01/AutoGenStudio). AutoGen Studio is a no-code UI for fast experimentation with the multi-agent conversations. It lowers the barrier of entrance to the AutoGen technology. Models, agents, and workflows can all be configured without writing code. And chatting with multiple agents in a playground is immediately available after the configuration. Although only a subset of `pyautogen` features are available in this sample app, it demonstrates a promising experience. It has generated tremendous excitement in the community.
+- [AutoGen Studio](/blog/2023/12/01/AutoGenStudio). AutoGen Studio is a no-code UI for fast experimentation with the multi-agent conversations. It lowers the barrier of entrance to the AutoGen technology. Models, agents, and workflows can all be configured without writing code. And chatting with multiple agents in a playground is immediately available after the configuration. Although only a subset of `autogen` features are available in this sample app, it demonstrates a promising experience. It has generated tremendous excitement in the community.
- Conversation Programming+. The [AutoGen paper](https://arxiv.org/abs/2308.08155) introduced a key concept of _Conversation Programming_, which can be used to program diverse conversation patterns such as 1-1 chat, group chat, hierarchical chat, nested chat etc. While we offered dynamic group chat as an example of high-level orchestration, it made other patterns relatively less discoverable. Therefore, we have added more convenient conversation programming features which enable easier definition of other types of complex workflow, such as [finite state machine based group chat](/blog/2024/02/11/FSM-GroupChat), [sequential chats](/docs/notebooks/agentchats_sequential_chats), and [nested chats](/docs/notebooks/agentchat_nestedchat). Many users have found them useful in implementing specific patterns, which have always been possible but are now more obvious with the added features. I will write another blog post for a deep dive.

### Learning/Optimization/Teaching
8 changes: 4 additions & 4 deletions website/blog/2024-06-24-AltModels-Classes/index.mdx
@@ -72,10 +72,10 @@ Now it's time to try them out.
Install the appropriate client based on the model you wish to use.

```sh
-pip install pyautogen["mistral"] # for Mistral AI client
-pip install pyautogen["anthropic"] # for Anthropic client
-pip install pyautogen["together"] # for Together.AI client
-pip install pyautogen["groq"] # for Groq client
+pip install autogen["mistral"] # for Mistral AI client
+pip install autogen["anthropic"] # for Anthropic client
+pip install autogen["together"] # for Together.AI client
+pip install autogen["groq"] # for Groq client
```

### Configuration Setup
6 changes: 3 additions & 3 deletions website/docs/FAQ.mdx
@@ -4,12 +4,12 @@ import TOCInline from "@theme/TOCInline";

<TOCInline toc={toc} />

-## Install the correct package - `pyautogen`
+## Install the correct package - `autogen`

-The name of Autogen package at PyPI is `pyautogen`:
+The name of Autogen package at PyPI is `autogen`:

```
-pip install pyautogen
+pip install autogen
```

Typical errors that you might face when using the wrong package are `AttributeError: module 'autogen' has no attribute 'Agent'`, `AttributeError: module 'autogen' has no attribute 'config_list_from_json'` etc.
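The `AttributeError`s above usually mean a different distribution is shadowing the expected `autogen` module. A quick way to diagnose this — sketched here against the stdlib `json` module rather than `autogen`, with an illustrative helper name — is to check that the module is importable and exposes the attributes you expect:

```python
import importlib.util


def diagnose_package(module_name: str, required_attrs: tuple) -> list:
    """Report which expected attributes are missing from an installed module."""
    spec = importlib.util.find_spec(module_name)
    if spec is None:
        return [f"{module_name} is not installed"]
    module = importlib.import_module(module_name)
    return [attr for attr in required_attrs if not hasattr(module, attr)]


# Against stdlib `json`: `loads` exists, `Agent` does not — the same shape
# of check you would run against `autogen` for `Agent` or
# `config_list_from_json` to confirm you installed the right package.
problems = diagnose_package("json", ("loads", "Agent"))
```

If the check reports missing attributes for `autogen`, uninstall any conflicting distribution and reinstall the one named in this FAQ.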
4 changes: 2 additions & 2 deletions website/docs/Getting-Started.mdx
@@ -35,7 +35,7 @@ Microsoft, Penn State University, and University of Washington.
### Quickstart

```sh
-pip install pyautogen
+pip install autogen
```
:::tip
You can also install with different [optional dependencies](/docs/installation/Optional-Dependencies).
@@ -135,7 +135,7 @@ The figure below shows an example conversation flow with AutoGen.
- Read the [API](/docs/reference/agentchat/conversable_agent/) docs
- Learn about [research](/docs/Research) around AutoGen
- Chat on [Discord](https://aka.ms/autogen-dc)
-- Follow on [Twitter](https://twitter.com/pyautogen)
+- Follow on [Twitter](https://twitter.com/Chi_Wang_)
- See our [roadmaps](https://aka.ms/autogen-roadmap)

If you like our project, please give it a [star](https://github.com/autogen-ai/autogen/stargazers) on GitHub. If you are interested in contributing, please read [Contributor's Guide](/docs/contributor-guide/contributing).
14 changes: 7 additions & 7 deletions website/docs/installation/Installation.mdx
@@ -13,8 +13,8 @@ When installing AutoGen locally, we recommend using a virtual environment for th
Create and activate:

```bash
-python3 -m venv pyautogen
-source pyautogen/bin/activate
+python3 -m venv autogen
+source autogen/bin/activate
```

To deactivate later, run:
@@ -32,8 +32,8 @@ When installing AutoGen locally, we recommend using a virtual environment for th
Create and activate:

```bash
-conda create -n pyautogen python=3.10
-conda activate pyautogen
+conda create -n autogen python=3.10
+conda activate autogen
```

To deactivate later, run:
@@ -52,7 +52,7 @@ When installing AutoGen locally, we recommend using a virtual environment for th
poetry init
poetry shell

-poetry add pyautogen
+poetry add autogen
```

To deactivate later, run:
@@ -69,12 +69,12 @@ When installing AutoGen locally, we recommend using a virtual environment for th
AutoGen requires **Python version >= 3.8, < 3.13**. It can be installed from pip:

```bash
-pip install pyautogen
+pip install autogen
```

:::info

-`pyautogen<0.2` required `openai<1`. Starting from pyautogen v0.2, `openai>=1` is required.
+`openai>=1` is required.

:::
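Since the docs above state that AutoGen requires **Python >= 3.8, < 3.13**, a small pre-install guard can catch an unsupported interpreter before `pip install autogen` fails late (the helper name is illustrative, not part of AutoGen):

```python
import sys


def python_version_supported(version_info=sys.version_info) -> bool:
    """Return True if the interpreter satisfies Python >= 3.8, < 3.13."""
    major_minor = (version_info[0], version_info[1])
    return (3, 8) <= major_minor < (3, 13)


# Check the running interpreter before attempting the install.
supported = python_version_supported()
```

Tuple comparison makes the bounds check read exactly like the documented range, with no string parsing of version numbers.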

28 changes: 14 additions & 14 deletions website/docs/installation/Optional-Dependencies.md
@@ -5,7 +5,7 @@
AutoGen installs OpenAI package by default. To use LLMs by other providers, you can install the following packages:

```bash
-pip install pyautogen[gemini,anthropic,mistral,together,groq,cohere]
+pip install autogen[gemini,anthropic,mistral,together,groq,cohere]
```

Check out the [notebook](/docs/notebooks/autogen_uniformed_api_calling) and
@@ -17,7 +17,7 @@ To use LLM caching with Redis, you need to install the Python package with
the option `redis`:

```bash
-pip install "pyautogen[redis]"
+pip install "autogen[redis]"
```

See [LLM Caching](/docs/topics/llm-caching) for details.
@@ -28,7 +28,7 @@ To use the IPython code executor, you need to install the `jupyter-client`
and `ipykernel` packages:

```bash
-pip install "pyautogen[ipython]"
+pip install "autogen[ipython]"
```

To use the IPython code executor:
@@ -44,27 +44,27 @@ proxy = UserProxyAgent(name="proxy", code_execution_config={"executor": "ipython
`pyautogen<0.2` offers a cost-effective hyperparameter optimization technique [EcoOptiGen](https://arxiv.org/abs/2303.04673) for tuning Large Language Models. Please install with the [blendsearch] option to use it.

```bash
-pip install "pyautogen[blendsearch]<0.2"
+pip install "autogen[blendsearch]<0.2"
```

Checkout [Optimize for Code Generation](https://github.com/autogen-ai/autogen/blob/main/notebook/oai_completion.ipynb) and [Optimize for Math](https://github.com/autogen-ai/autogen/blob/main/notebook/oai_chatgpt_gpt4.ipynb) for details.

## retrievechat

-`pyautogen` supports retrieval-augmented generation tasks such as question answering and code generation with RAG agents. Please install with the [retrievechat] option to use it with ChromaDB.
+`autogen` supports retrieval-augmented generation tasks such as question answering and code generation with RAG agents. Please install with the [retrievechat] option to use it with ChromaDB.

```bash
-pip install "pyautogen[retrievechat]"
+pip install "autogen[retrievechat]"
```

-Alternatively `pyautogen` also supports PGVector and Qdrant which can be installed in place of ChromaDB, or alongside it.
+Alternatively `autogen` also supports PGVector and Qdrant which can be installed in place of ChromaDB, or alongside it.

```bash
-pip install "pyautogen[retrievechat-pgvector]"
+pip install "autogen[retrievechat-pgvector]"
```

```bash
-pip install "pyautogen[retrievechat-qdrant]"
+pip install "autogen[retrievechat-qdrant]"
```

RetrieveChat can handle various types of documents. By default, it can process
@@ -89,7 +89,7 @@ Example notebooks:
To use Teachability, please install AutoGen with the [teachable] option.

```bash
-pip install "pyautogen[teachable]"
+pip install "autogen[teachable]"
```

Example notebook: [Chatting with a teachable agent](/docs/notebooks/agentchat_teachability)
@@ -99,7 +99,7 @@ Example notebook: [Chatting with a teachable agent](/docs/notebooks/agentchat_te
We offered Multimodal Conversable Agent and LLaVA Agent. Please install with the [lmm] option to use it.

```bash
-pip install "pyautogen[lmm]"
+pip install "autogen[lmm]"
```

Example notebook: [LLaVA Agent](/docs/notebooks/agentchat_lmm_llava)
@@ -109,7 +109,7 @@ Example notebook: [LLaVA Agent](/docs/notebooks/agentchat_lmm_llava)
`pyautogen<0.2` offers an experimental agent for math problem solving. Please install with the [mathchat] option to use it.

```bash
-pip install "pyautogen[mathchat]<0.2"
+pip install "autogen[mathchat]<0.2"
```

Example notebook: [Using MathChat to Solve Math Problems](https://github.com/autogen-ai/autogen/blob/main/notebook/agentchat_MathChat.ipynb)
@@ -119,7 +119,7 @@ Example notebook: [Using MathChat to Solve Math Problems](https://github.com/aut
To use a graph in `GroupChat`, particularly for graph visualization, please install AutoGen with the [graph] option.

```bash
-pip install "pyautogen[graph]"
+pip install "autogen[graph]"
```

Example notebook: [Finite State Machine graphs to set speaker transition constraints](/docs/notebooks/agentchat_groupchat_finite_state_machine)
@@ -129,5 +129,5 @@ Example notebook: [Finite State Machine graphs to set speaker transition constra
AutoGen includes support for handling long textual contexts by leveraging the LLMLingua library for text compression. To enable this functionality, please install AutoGen with the `[long-context]` option:

```bash
-pip install "pyautogen[long-context]"
+pip install "autogen[long-context]"
```
@@ -5,10 +5,10 @@ Text compression is crucial for optimizing interactions with LLMs, especially wh
This guide introduces LLMLingua's integration with AutoGen, demonstrating how to use this tool to compress text, thereby optimizing the usage of LLMs for various applications.

:::info Requirements
-Install `pyautogen[long-context]` and `PyMuPDF`:
+Install `autogen[long-context]` and `PyMuPDF`:

```bash
-pip install "pyautogen[long-context]" PyMuPDF
+pip install "autogen[long-context]" PyMuPDF
```

For more information, please refer to the [installation guide](/docs/installation/).
@@ -13,10 +13,10 @@ Why do we need to handle long contexts? The problem arises from several constrai
The `TransformMessages` capability is designed to modify incoming messages before they are processed by the LLM agent. This can include limiting the number of messages, truncating messages to meet token limits, and more.

:::info Requirements
-Install `pyautogen`:
+Install `autogen`:

```bash
-pip install pyautogen
+pip install autogen
```

For more information, please refer to the [installation guide](/docs/installation/).
