Model-centric LLM interface #643
andreibondarev started this conversation in Ideas
Replies: 1 comment
-
I like this a lot 😄 Trying to think about what the interface would look like with this, the goal being to reduce user friction and get to the point of use faster. Maybe something like:

```ruby
# default provider to instantiate `Langchain::LLM::Anthropic` with env vars?
Langchain::LLM.get("claude-3").chat(...)

# or, pre-configure the LLM provider as usual
aws_bedrock_provider = Langchain::LLM::AwsBedrock.new(...)
Langchain::LLM.get("command-4-plus", provider: aws_bedrock_provider).chat(...)
```
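One way the `Langchain::LLM.get` lookup above could work is a registry keyed by model-name patterns. A minimal sketch, assuming nothing about langchainrb's internals (the module name, provider symbols, and pattern-to-provider mapping below are all illustrative):

```ruby
# Hypothetical registry (not the langchainrb API): resolves a model-name
# string such as "claude-3" to the provider that should serve it.
module LLMRegistry
  # Illustrative mapping; a real registry would likely be user-extensible.
  PROVIDERS = {
    /\Aclaude/  => :anthropic,  # e.g. Langchain::LLM::Anthropic
    /\Acommand/ => :cohere,     # e.g. Cohere Command models
    /\Agpt/     => :openai
  }.freeze

  # Returns the provider key whose pattern matches the model name,
  # or raises ArgumentError if no pattern matches.
  def self.resolve(model_name)
    PROVIDERS.each do |pattern, provider|
      return provider if pattern.match?(model_name)
    end
    raise ArgumentError, "no provider registered for #{model_name.inspect}"
  end
end
```

With this shape, `LLMRegistry.resolve("claude-3")` yields the default provider, while an explicit `provider:` keyword (as in the comment above) would simply bypass the lookup.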
-
Description
When discussing the LLM interfaces with @codenamev, it was suggested that we switch to a model-centric interface. This makes the model the first-class citizen and the "LLM provider" the second-class citizen.
The proposed interface would start with instantiating a model and then selecting the LLM provider, or the "deployment target". For example:
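A minimal sketch of what such a model-first interface might look like (the class name, `on` method, and provider symbols here are illustrative, not the actual langchainrb API):

```ruby
# Hypothetical model-first interface: the model is instantiated first,
# then bound to a provider / "deployment target" as a second step.
Model = Struct.new(:name, :provider) do
  # Bind this model to a deployment target and return self for chaining.
  def on(target)
    self.provider = target
    self
  end
end

# The model comes first; the deployment target is chosen afterwards.
model = Model.new("claude-3").on(:anthropic)
```

The design choice being weighed is exactly this inversion: today the provider object is constructed first and the model is a parameter of it, whereas here the provider is a late-bound attribute of the model.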