Make it easier to specify a normalization method for Chat.append_message_stream() #1621

Open
cpsievert opened this issue Aug 20, 2024 · 0 comments · May be fixed by #1626

Comments

cpsievert (Collaborator) commented Aug 20, 2024

Currently, if you want to teach Chat.append_message_stream() about response formats that it doesn't already know about, you have to register a new "message normalizer" strategy with the internal message_normalizer_registry object.

Although that registry is convenient for developers, and potentially still worth exporting (it's currently internal), it's a lot to ask most users to learn and implement.

It'd be much easier if you could just pass a function to .append_message_stream() that extracts the relevant content from each chunk of the stream.

For example, something like this (from #1610):

from shiny.ui._chat_normalize import BaseMessageNormalizer, message_normalizer_registry

class LangchainAgentResponseNormalizer(BaseMessageNormalizer):
    # For each chunk of a .append_message_stream()
    def normalize_chunk(self, chunk):
        return chunk["messages"][0].content

    def can_normalize_chunk(self, chunk):
        return "messages" in chunk and len(chunk["messages"]) > 0

    # For .append_message()
    def normalize(self, message):
        return message["messages"][0].content

    def can_normalize(self, message):
        return "messages" in message and len(message["messages"]) > 0

message_normalizer_registry.register(
    "langchain-agents", LangchainAgentResponseNormalizer()
)

could instead become something like:

chat.append_message_stream(response, lambda x: x["messages"][0].content)
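For illustration, here's one rough way such a convenience could be layered on top of the existing machinery: wrap the supplied callable in a normalizer so the current registry-based dispatch keeps working unchanged. This is only a sketch of the idea, not a proposed implementation, and the FunctionNormalizer name is made up here:

from shiny.ui._chat_normalize import BaseMessageNormalizer

class FunctionNormalizer(BaseMessageNormalizer):
    # Hypothetical adapter: turns a plain callable into a normalizer
    def __init__(self, fn):
        self._fn = fn

    # Both stream chunks and whole messages are handled by the same
    # user-supplied function
    def normalize_chunk(self, chunk):
        return self._fn(chunk)

    def can_normalize_chunk(self, chunk):
        return True

    def normalize(self, message):
        return self._fn(message)

    def can_normalize(self, message):
        return True

Internally, .append_message_stream() could then check whether such a function was passed and, if so, use it instead of consulting message_normalizer_registry.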