From b68f7b7bebdb431a874592f40f7dba3c2d4d7303 Mon Sep 17 00:00:00 2001
From: Kaneki
Date: Wed, 13 Nov 2024 13:25:55 +0800
Subject: [PATCH] [docs] Updated conversation template doc (#3020)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

[docs] Updated path pointers in both docs files to point to the
`conversation_template` directory instead of linking to the non-existent
`conversation_template.py` file.

Co-authored-by: 涧波
---
 docs/compilation/convert_weights.rst | 4 ++--
 docs/deploy/mlc_chat_config.rst      | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/compilation/convert_weights.rst b/docs/compilation/convert_weights.rst
index c0b9ea2fbb..1bca2de439 100644
--- a/docs/compilation/convert_weights.rst
+++ b/docs/compilation/convert_weights.rst
@@ -102,8 +102,8 @@ See :ref:`compile-command-specification` for specification of ``gen_config``.
 ``dist/RedPajama-INCITE-Instruct-3B-v1-q4f16_1-MLC/mlc-chat-config.json``
 (checkout :ref:`configure-mlc-chat-json` for more detailed instructions).
 You can also simply use the default configuration.
-`conversation_template.py `__
-contains a full list of conversation templates that MLC provides. If the model you are adding
+`conversation_template `__
+directory contains a full list of conversation templates that MLC provides. If the model you are adding
 requires a new conversation template, you would need to add your own. Follow
 `this PR `__ as an example. However, adding your own template would require you
 :ref:`build mlc_llm from source ` in order for it
diff --git a/docs/deploy/mlc_chat_config.rst b/docs/deploy/mlc_chat_config.rst
index d5e5628fc2..4222a2ccd8 100644
--- a/docs/deploy/mlc_chat_config.rst
+++ b/docs/deploy/mlc_chat_config.rst
@@ -110,7 +110,7 @@ supported conversation templates:
 - ``phi-2``
 - ...

-Please refer to `conversation_template.py `_ for the full list of supported templates and their implementations.
+Please refer to `conversation_template `_ directory for the full list of supported templates and their implementations.

 Below is a generic structure of a JSON conversation configuration (we use vicuna
 as an example):