Support for custom LLM subroutine wrappers for PromptNode #4774
recursionbane started this conversation in Ideas
Replies: 1 comment
-
Hey @recursionbane, I think what you're looking for is an invocation layer. If you need some more guidance on how to create a new one, I'd suggest you join our Discord and ask the community over there; they're pretty responsive. :)
-
Hello,
We have an internally provided library that wraps an LLM call behind a non-streaming, simplified interface.
It looks something like:
This allows application developers to build synchronous programs on top of a very simple interface, choosing either to let the LLM library handle chat history or to manage it themselves (e.g., prompt stuffing for single-shot queries).
Would it be possible to get a generic custom_function_llm.py extension to nodes/prompt/ to allow the PromptNode to interface with custom libraries using just Python function shims? I'm okay writing a shim (e.g., send_prompt_haystack_wrapper()) around the internal library to make its shape match Haystack's expectations. One possible interface:
The benefit of this would be to enable Haystack PromptNode Agents to be used in many firewalled settings.
More companies will have their own internal LLM APIs, and Haystack cannot support every one of them.
Supporting a simplified, blocking custom-LLM subroutine interface would make it easier for these companies to leverage Haystack while building their own libraries around their APIs, with compliance checks and auditing.