Replies: 1 comment
-
Just for giggles, I modified the docker-compose.override.yml file to run Ollama alongside LibreChat instead of using my own instance, loaded my models, and regular chat works as expected. But I'm still getting the same issue with Agents. I turned on debug logging and captured this in case it's helpful.
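For reference, the addition to the override file looked roughly like this (a minimal sketch; the image tag, port mapping, and volume name are my own choices, not anything from the LibreChat docs):

```yaml
# docker-compose.override.yml — run Ollama as a sibling service to LibreChat
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"       # Ollama's default API port
    volumes:
      - ollama:/root/.ollama  # persist pulled models across restarts

volumes:
  ollama:
```

With this, the LibreChat api container can reach Ollama at http://ollama:11434 over the shared compose network.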
-
I'm currently trying to set up an Agent to test with. I'm using Ollama on the backend with the llama3.2-vision model. When I submit a prompt, I get this error message:
In the logs I see this entry:
I've checked all of the connections and everything looks correct; nothing is unresponsive. I'm wondering if I'm missing something and hoping someone can point me in the right direction. I do have LibreChat behind a reverse proxy, although I removed the proxy during troubleshooting and still received the same message. I've scoured the docs as best I could and searched this forum; I found a similar issue that mentioned something about JWT tokens and so on, and I've generated and replaced all of the default secrets, but I still encounter the issue.
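For context, here is roughly how the Ollama endpoint is wired up in my librechat.yaml (a sketch of my setup; the endpoint name and baseURL are specific to my environment):

```yaml
# librechat.yaml — custom endpoint pointing LibreChat at Ollama
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"        # Ollama doesn't check the key, but the field must be set
      baseURL: "http://ollama:11434/v1/"  # adjust host to wherever Ollama runs
      models:
        default: ["llama3.2-vision"]
        fetch: true
```

And these are the .env secrets I regenerated (values below are placeholders; I produced them with `openssl rand -hex 32` for the 32-byte keys and `openssl rand -hex 16` for the IV):

```bash
JWT_SECRET=<64-char hex string>
JWT_REFRESH_SECRET=<64-char hex string>
CREDS_KEY=<64-char hex string>
CREDS_IV=<32-char hex string>
```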
Thanks in advance for any guidance.