
Commit

docs: fix chat figures
lbeurerkellner committed Oct 13, 2023
1 parent 66b7573 commit b27dc19
Showing 3 changed files with 15 additions and 36 deletions.
11 changes: 4 additions & 7 deletions docs/docs/lib/chat/internal.md
@@ -5,13 +5,10 @@ order: 3

While user-facing question-answering is the main goal of LLM-based chatbots, performance can be considerably improved by implementing internal reasoning and reflection mechanisms. In this chapter, we will discuss the implementation of such mechanisms in LMQL Chat.

```{figure} https://github.com/eth-sri/lmql/assets/17903049/cb609b5c-8984-414a-a3b6-b3fa6f8ab6bb
:name: lmql-chat
:alt: A chatbot with internal reasoning capabilities.
:align: center
A chatbot with internal reasoning capabilities.
```
<figure align="center" style="width: 100%; margin: auto;">
<img style="min-height: 100pt" src="https://github.com/eth-sri/lmql/assets/17903049/cb609b5c-8984-414a-a3b6-b3fa6f8ab6bb" alt="A chatbot with internal reasoning capabilities."/>
<figcaption>A chatbot that relies on internal reasoning.</figcaption>
</figure>

Building on the simple chat application implemented in [](./overview.md), we extend the chat loop as follows:

17 changes: 4 additions & 13 deletions docs/docs/lib/chat/overview.md
@@ -65,19 +65,10 @@ lmql chat chat.lmql

Once the server is running, you can access the chatbot at the provided local URL.

```{toctree}
:hidden:
./chat/overview
```

```{figure} https://github.com/eth-sri/lmql/assets/17903049/334e9ab4-aab8-448d-9dc0-c53be8351e27
:name: lmql-chat
:alt: A simple chatbot using the LMQL chat UI
:align: center
A simple chatbot using the LMQL Chat UI.
```
<figure align="center" style="width: 100%; margin: auto;">
<img style="min-height: 100pt" src="https://github.com/eth-sri/lmql/assets/17903049/334e9ab4-aab8-448d-9dc0-c53be8351e27" alt="A simple chatbot using the LMQL Chat UI."/>
<figcaption>A simple chatbot using the LMQL Chat UI.</figcaption>
</figure>

In this interface, you can interact with your chatbot by typing into the input field at the bottom of the screen. The chatbot responds to your input while also considering the system prompt provided in your program. On the right, you can inspect the full internal prompt of your program, including the generated prompt statements and the model output. This lets you see, at all times, exactly what input the model received and how it responded.

23 changes: 7 additions & 16 deletions docs/docs/lib/chat/serving.md
@@ -15,23 +15,14 @@ To locally serve an LMQL chat endpoint and user interface, simply run the follow
lmql chat <path-to-lmql-file>.lmql
```

This will serve a web interface on `http://localhost:8089`, and automatically open it in your default browser. You can now start chatting with your custom LMQL chat application. The internal trace on the right-hand side always displays the complete current prompt, reflecting the current state of your chat application.
This will serve a web interface on `http://localhost:8089` and automatically open it in your default browser. You can now start chatting with your custom LMQL chat application. The internal trace on the right-hand side (shown below) always displays the complete conversational prompt, reflecting the current state of your chat application.

Note that changing the `.lmql` file will **not** automatically reload the server, so you will have to restart the server manually to see the changes.
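For reference, a minimal chat program served this way might look as follows. This is a sketch following the chat-loop pattern from the [overview](./overview.md) chapter; the system prompt, variable name, and model are placeholders for your own application:

```lmql
argmax
    "{:system} You are a helpful assistant."
    while True:
        "{:user} {await input()}"
        "{:assistant} [@message ANSWER]"
from
    "chatgpt"
```

Saving this as e.g. `chat.lmql` and running `lmql chat chat.lmql` launches the web interface shown below.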

```{toctree}
:hidden:
./chat/overview
```

```{figure} https://github.com/eth-sri/lmql/assets/17903049/334e9ab4-aab8-448d-9dc0-c53be8351e27
:name: lmql-chat
:alt: A simple chatbot using the LMQL chat UI
:align: center
A simple chatbot using the LMQL Chat UI.
```
<figure align="center" style="width: 100%; margin: auto;">
<img style="min-height: 100pt" src="https://github.com/eth-sri/lmql/assets/17903049/334e9ab4-aab8-448d-9dc0-c53be8351e27" alt="A simple chatbot using the LMQL Chat UI."/>
<br/><figcaption>A simple chatbot launched via <code>lmql chat</code>.</figcaption>
</figure>

## Using `chatserver`

@@ -49,12 +40,12 @@ Note that when passing a query function directly, you have to always provide a `

Chat relies on [decorator-based](../../language/decorators.md) output streaming. More specifically, only model output variables that are annotated as `@message` are streamed and shown to the user in the chat interface. This allows for a clean separation of model output and chat output, and enables hidden/internal reasoning.
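To illustrate the separation this enables, consider the following standalone sketch (plain Python, no LMQL dependency; the class and variable names are illustrative, not the `lmql.lib.chat` API): an output writer forwards only variables marked as messages to the chat UI, while all other variables stay in the internal trace.

```python
class DemoChatWriter:
    """Illustrative writer separating user-facing output from internal reasoning."""

    def __init__(self, message_vars):
        self.message_vars = set(message_vars)  # variables treated as @message
        self.visible = []    # chunks streamed to the chat UI
        self.internal = []   # chunks kept in the internal trace only

    def add(self, variable, chunk):
        # Only message variables reach the user; everything else stays internal.
        if variable in self.message_vars:
            self.visible.append(chunk)
        else:
            self.internal.append(chunk)

writer = DemoChatWriter(message_vars={"ANSWER"})
writer.add("THOUGHT", "The user asks about pricing...")
writer.add("ANSWER", "Our basic plan starts at $10/month.")
# writer.visible now holds only the ANSWER chunk; THOUGHT stays internal.
```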

To use `@message` with your [custom output writer](../output.ipynb), make sure to inherit from `lmql.lib.chat`'s `ChatMessageOutputWriter`, which offers additional methods for specifically handling and streaming `@message` variables.
To use `@message` with your [custom output writer](../output.html), make sure to inherit from `lmql.lib.chat`'s `ChatMessageOutputWriter`, which offers additional methods for specifically handling and streaming `@message` variables.

## More Advanced Usage

For more advanced serving scenarios, e.g. when integrating Chat into your own web applications, please refer to the minimal implementation of `chatserver` in [`src/lmql/ui/chat/__init__.py`](https://github.com/eth-sri/lmql/blob/main/src/lmql/ui/chat/__init__.py), which can easily be adapted to your own needs and infrastructure. The corresponding web UI is implemented in [`src/lmql/ui/chat/assets/`](https://github.com/eth-sri/lmql/blob/main/src/lmql/ui/chat/assets/) and offers a good starting point for your own implementation and UI adaptations on the client side.

For other forms of output streaming e.g. via HTTP or SSE, see also the chapter on [Output Streaming](../output.ipynb)
For other forms of output streaming, e.g. via HTTP or SSE, see also the chapter on [Output Streaming](../output.html).

**Disclaimer**: The LMQL chat server is a simple code template that does not include any security features, authentication or cost control. It is intended for local development and testing only, and should not be used as-is in production environments. Before deploying your own chat application, make sure to implement the necessary security measures, cost control and authentication mechanisms.
