diff --git a/docs/compilation/convert_weights.rst b/docs/compilation/convert_weights.rst
index c0b9ea2fbb..1bca2de439 100644
--- a/docs/compilation/convert_weights.rst
+++ b/docs/compilation/convert_weights.rst
@@ -102,8 +102,8 @@ See :ref:`compile-command-specification` for specification of ``gen_config``.
    ``dist/RedPajama-INCITE-Instruct-3B-v1-q4f16_1-MLC/mlc-chat-config.json`` (checkout
    :ref:`configure-mlc-chat-json` for more detailed instructions). You can also simply
    use the default configuration.
-   `conversation_template.py `__
-   contains a full list of conversation templates that MLC provides. If the model you are adding
+   The `conversation_template `__
+   directory contains a full list of conversation templates that MLC provides. If the model you are adding
    requires a new conversation template, you would need to add your own. Follow
    `this PR `__ as an example. However, adding your own template would require you
    :ref:`build mlc_llm from source ` in order for it
diff --git a/docs/deploy/mlc_chat_config.rst b/docs/deploy/mlc_chat_config.rst
index d5e5628fc2..4222a2ccd8 100644
--- a/docs/deploy/mlc_chat_config.rst
+++ b/docs/deploy/mlc_chat_config.rst
@@ -110,7 +110,7 @@ supported conversation templates:
 - ``phi-2``
 - ...
 
-Please refer to `conversation_template.py `_ for the full list of supported templates and their implementations.
+Please refer to the `conversation_template `_ directory for the full list of supported templates and their implementations.
 
 Below is a generic structure of a JSON conversation configuration (we use vicuna as an example):
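
The second hunk ends on the sentence that introduces the vicuna example, which itself falls outside the diff context. For orientation only, a conversation configuration of the kind that sentence describes looks roughly like the sketch below. This is a minimal illustration assuming the field names of MLC's conversation protocol (``system_template``, ``roles``, ``seps``, and so on); the concrete values here are assumptions, not the docs' actual example, so consult the generated ``mlc-chat-config.json`` and the ``conversation_template`` directory for the authoritative structure.

.. code:: json

   {
     "name": "vicuna_v1.1",
     "system_template": "{system_message}",
     "system_message": "A chat between a curious user and an artificial intelligence assistant.",
     "roles": {
       "user": "USER",
       "assistant": "ASSISTANT"
     },
     "role_templates": {
       "user": "{user_message}"
     },
     "messages": [],
     "seps": [" ", "</s>"],
     "role_content_sep": ": ",
     "role_empty_sep": ":",
     "stop_str": ["</s>"],
     "stop_token_ids": [2],
     "use_function_calling": false
   }

Here ``{system_message}`` and ``{user_message}`` are placeholders that the runtime substitutes when assembling a prompt; the ``stop_token_ids`` value of ``[2]`` assumes a Llama-family tokenizer whose EOS token id is 2.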