Batch scan support #59

Hi, do you have support for scanning texts / prompts in batches? I am thinking about something like Presidio's BatchAnalyzerEngine. Right now llm-guard can efficiently be used only for single prompts or outputs, but it cannot be used to scan whole datasets (e.g. for RAG). Do you plan to add support for those use cases?
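For reference, Presidio's batch interface wraps its single-text analyzer and accepts an iterable of texts, returning per-text results. A minimal sketch of that usage (Presidio code, shown only to illustrate the kind of API being asked for):

```python
from presidio_analyzer import AnalyzerEngine, BatchAnalyzerEngine

# Presidio wraps its single-text AnalyzerEngine in a batch engine
# that takes an iterable of texts and returns results per text.
analyzer = AnalyzerEngine()
batch_analyzer = BatchAnalyzerEngine(analyzer_engine=analyzer)

texts = ["My name is John Doe", "Call me at 212-555-0199"]
results = batch_analyzer.analyze_iterator(texts, language="en")

for text, recognizer_results in zip(texts, results):
    print(text, recognizer_results)
```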
Comments
Hey @gustavz, thanks for reaching out. RAG is in scope for the next release, which will introduce more examples and features around it; one of them could be batch support. One thing we should probably explore is latency, because there are a few ways to run it.

I will keep you updated on the progress.
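To make the latency question concrete, here is a minimal sketch of two ways to run a batch over the current single-prompt API. It assumes llm-guard's `scan_prompt(scanners, prompt)` interface; whether the threaded variant actually helps depends on the scanners releasing the GIL during model inference, and on the scanner instances being safe to share across threads (neither is guaranteed):

```python
import time
from concurrent.futures import ThreadPoolExecutor

from llm_guard import scan_prompt
from llm_guard.input_scanners import PromptInjection, Toxicity

scanners = [PromptInjection(), Toxicity()]
prompts = ["prompt one", "prompt two", "prompt three"]

# Option 1: plain sequential loop, one scan_prompt call per prompt.
start = time.perf_counter()
sequential = [scan_prompt(scanners, p) for p in prompts]
print(f"sequential: {time.perf_counter() - start:.2f}s")

# Option 2: thread pool over the same scanners. This only reduces
# latency if inference releases the GIL, and it shares scanner
# instances across threads, which assumes they are thread-safe.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    threaded = list(pool.map(lambda p: scan_prompt(scanners, p), prompts))
print(f"threaded: {time.perf_counter() - start:.2f}s")
```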
Any more information? I am working on RAG now and am trying to introduce it.
Maybe something like this (add …):
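A rough sketch of what such an addition could look like, assuming the current `scan_prompt(scanners, prompt)` behaviour; the `scan_prompts` name and the return shape are hypothetical, not part of llm-guard's API:

```python
from typing import Dict, List, Tuple

from llm_guard import scan_prompt


def scan_prompts(
    scanners: list,
    prompts: List[str],
) -> List[Tuple[str, Dict[str, bool], Dict[str, float]]]:
    """Hypothetical batch counterpart to scan_prompt: runs the same
    scanners over each prompt and collects the per-prompt results."""
    return [scan_prompt(scanners, prompt) for prompt in prompts]
```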
Hey @vincent-pli, in the next version we are planning to kick off a refactoring of inputs and outputs.