
Change chat custom normalizer error message to point to correct function #1754

Open · wants to merge 1 commit into main
Conversation

@wch (Collaborator) commented Nov 2, 2024

Previously, the error message said this:

Consider registering a custom normalizer via shiny.ui._chat_types.registry.register()

But the function shiny.ui._chat_types.registry.register() doesn't exist. I changed it to what I think is the correct function, shiny.ui._chat_normalize.message_normalizer_registry.register(), but please confirm that it's right.
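For readers unfamiliar with the normalizer-registry pattern being discussed, here is a simplified, self-contained stand-in illustrating the general idea. The class, method signatures, and error text below are hypothetical sketches, not shiny's actual implementation; the real registry lives at `shiny.ui._chat_normalize.message_normalizer_registry` and its `register()` signature may differ.

```python
# Hypothetical sketch of a message-normalizer registry; NOT shiny's actual API.
from typing import Any, Callable, Dict


class NormalizerRegistry:
    """Maps provider names to functions that normalize provider-specific
    message objects into a common {"role", "content"} dict."""

    def __init__(self) -> None:
        self._normalizers: Dict[str, Callable[[Any], dict]] = {}

    def register(self, provider: str, normalizer: Callable[[Any], dict]) -> None:
        self._normalizers[provider] = normalizer

    def normalize(self, provider: str, message: Any) -> dict:
        try:
            fn = self._normalizers[provider]
        except KeyError:
            # Mirrors the kind of error message this PR corrects: it should
            # point users at the function that actually exists.
            raise ValueError(
                f"No normalizer for provider {provider!r}. Consider registering "
                "a custom normalizer via "
                "shiny.ui._chat_normalize.message_normalizer_registry.register()"
            )
        return fn(message)


registry = NormalizerRegistry()
registry.register("anthropic", lambda msg: {"role": "user", "content": msg["text"]})
print(registry.normalize("anthropic", {"text": "hi"}))
```

The point of the fix is simply that the error message must name a callable that exists, so users who hit the error can follow it directly.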

@wch wch requested a review from cpsievert November 2, 2024 13:31
@cpsievert (Collaborator) commented Nov 4, 2024

Thanks. Out of curiosity, did you have to implement a new strategy, or is #1755 enough to accomplish what you need?

@wch (Collaborator, Author) commented Nov 4, 2024

#1755 was enough for my purposes. However, I should mention that the input message format when using prompt caching is different from the usual format. Instead of:

{ "role": "user", "content": "What color is the sky?" }

The input needs this format:

{
  "role": "user",
  "content": [
    {
      "type": "text",
      "text": "What color is the sky?",
      "cache_control": {"type": "ephemeral"}
    }
  ]
}

So for conversations, after calling `chat.messages(format="anthropic")`, the result needs to be transformed to the new format before sending it to the API endpoint. The system prompt needs to be transformed in the same way.

I wrote a function to do the transformation here, but note that it won't cover all cases -- for example, if you use non-text inputs, it will throw an error. It is sufficient for my use case though:

https://github.com/posit-dev/shiny-assistant/blob/f383675d60c1e709d3c1bdd66962d7bc2d51f4e0/app.py#L552-L597
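The transformation described above can be sketched roughly as follows. This is a hedged, self-contained sketch (the function name `add_cache_control` is made up for illustration, not taken from the linked code), and like the linked implementation it assumes text-only content and raises on anything else:

```python
# Sketch: convert standard {"role", "content": str} messages into the
# Anthropic prompt-caching format with cache_control content blocks.
# Assumes text-only content; non-string content raises, mirroring the
# limitation mentioned in the comment above.
from typing import Any, Dict, List


def add_cache_control(messages: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    out: List[Dict[str, Any]] = []
    for msg in messages:
        content = msg["content"]
        if not isinstance(content, str):
            raise ValueError("Only plain-text message content is supported.")
        out.append({
            "role": msg["role"],
            "content": [{
                "type": "text",
                "text": content,
                "cache_control": {"type": "ephemeral"},
            }],
        })
    return out


msgs = [{"role": "user", "content": "What color is the sky?"}]
print(add_cache_control(msgs))
```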

So we should think about how to make prompt caching easier to use.

@cpsievert (Collaborator) commented Nov 4, 2024

Good to know. As a heads up, we're soon going to promote chatlas as the way to manage the conversation state (and generate responses) instead of `chat.messages(format="...")`. That said, we don't currently have a good way to take advantage of Anthropic's approach to prompt caching, so it's probably worth sticking with `chat.messages(format="anthropic")` in shiny assistant, at least for now.
