docs: Update docstring for MessagesPlaceholder (langchain-ai#19601)
Updates the docstring for MessagesPlaceholder so that it shows helpful
information in code editors, e.g. VS Code as shown below.


<img width="587" alt="Screenshot 2024-03-26 at 17 18 58"
src="https://github.com/langchain-ai/langchain/assets/45722942/8f49d09f-ed8d-4f61-a9d4-3611dbe9c9c5">

---------

Co-authored-by: Bagatur <[email protected]>
jhicks2306 and baskaryan authored Mar 26, 2024
1 parent 7c2578b commit 087823a
Showing 1 changed file with 56 additions and 1 deletion.
libs/core/langchain_core/prompts/chat.py (56 additions, 1 deletion)
@@ -96,12 +96,67 @@ def __add__(self, other: Any) -> ChatPromptTemplate:
 
 
 class MessagesPlaceholder(BaseMessagePromptTemplate):
-    """Prompt template that assumes variable is already list of messages."""
+    """Prompt template that assumes variable is already list of messages.
+
+    A placeholder which can be used to pass in a list of messages.
+
+    Direct usage:
+
+        .. code-block:: python
+
+            from langchain_core.prompts import MessagesPlaceholder
+
+            prompt = MessagesPlaceholder("history")
+            prompt.format_messages() # raises KeyError
+
+            prompt = MessagesPlaceholder("history", optional=True)
+            prompt.format_messages() # returns empty list []
+
+            prompt.format_messages(
+                history=[
+                    ("system", "You are an AI assistant."),
+                    ("human", "Hello!"),
+                ]
+            )
+            # -> [
+            #     SystemMessage(content="You are an AI assistant."),
+            #     HumanMessage(content="Hello!"),
+            # ]
+
+    Building a prompt with chat history:
+
+        .. code-block:: python
+
+            from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
+
+            prompt = ChatPromptTemplate.from_messages(
+                [
+                    ("system", "You are a helpful assistant."),
+                    MessagesPlaceholder("history"),
+                    ("human", "{question}")
+                ]
+            )
+            prompt.invoke(
+                {
+                    "history": [("human", "what's 5 + 2"), ("ai", "5 + 2 is 7")],
+                    "question": "now multiply that by 4"
+                }
+            )
+            # -> ChatPromptValue(messages=[
+            #     SystemMessage(content="You are a helpful assistant."),
+            #     HumanMessage(content="what's 5 + 2"),
+            #     AIMessage(content="5 + 2 is 7"),
+            #     HumanMessage(content="now multiply that by 4"),
+            # ])
+    """
 
     variable_name: str
     """Name of variable to use as messages."""
 
     optional: bool = False
     """If True format_messages can be called with no arguments and will return an empty
     list. If False then a named argument with name `variable_name` must be passed
     in, even if the value is an empty list."""
 
     @classmethod
     def get_lc_namespace(cls) -> List[str]:
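
As a quick check of the `optional` flag documented above, a minimal sketch (assuming only that langchain_core is importable; the commented results restate the documented behaviour rather than captured output):

    from langchain_core.prompts import MessagesPlaceholder

    required = MessagesPlaceholder("history")    # optional defaults to False
    required.format_messages(history=[])         # empty list must still be passed explicitly -> []
    # required.format_messages()                 # would raise KeyError

    relaxed = MessagesPlaceholder("history", optional=True)
    relaxed.format_messages()                    # -> []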
