New cause of the Connection refused error #3912
Replies: 5 comments
-
Hey there, @LaciliaExe! 👋 I'm Dosu, your friendly neighborhood bot here to sprinkle some magic on your bug squashing quests, answer those brain-boggling questions, and guide you on your path to becoming a legendary contributor! While we wait for a human maintainer to join us, think of me as your sidekick in the digital realm. 🚀 I'm diving into your issue as we speak and will get back to you with some insights shortly. Stay tuned! 🌟
-
I labeled this as a "documentation" issue because, although the documentation offers a solution for the error "HTTPConnectionPool(host=localhost, port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8562812c20>: Failed to establish a new connection: [Errno 111] Connection refused'))", it is incomplete: there are other causes (such as the one I encountered) that lead to the same error.
-
Addendum: once you do this, you can enter the address http://host.docker.internal:11434/ in the location shown in the picture below and successfully add Ollama's LLM!
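Before entering the address in Dify, it can help to confirm that the URL is actually reachable from where Dify runs. A minimal sketch (a running Ollama server answers GET / with "Ollama is running"; the helper name is illustrative, not part of Dify or Ollama):

```python
import urllib.error
import urllib.request


def ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    A running Ollama instance replies to GET / with 200 and the
    body "Ollama is running"; any connection error means the URL
    is not reachable from this host or container.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Running this inside the Dify container (e.g. via `docker exec`) against `http://host.docker.internal:11434/` tells you whether the container-to-host path is the problem, independent of Dify's own configuration.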
-
Thank you for the detailed explanation and for highlighting the gap in our documentation regarding Docker configurations on Linux systems. Your suggestion to modify the docker-compose.yaml file is a valuable one. I'll forward your feedback to the documentation team to consider adding this information to our setup guides, ensuring future users can easily configure their environments across different operating systems, including Linux. Your contribution helps improve the experience for everyone using these services in a Dockerized environment. If there's anything more you'd like to add or another issue you encounter, please feel free to share.
-
In the end I deployed Dify in Docker on server A, which let it reach Ollama through http://host.docker.internal:11434/ (i.e. the host's localhost:11434). If Dify were still running in Docker on server B, this alone might not have solved the problem; to make it work from server B, you could try changing the setting shown in the red box.
-
Self Checks
Provide a description of requested docs changes
I have read #2540 and similar issues where many people encountered Connection timeout or Connection refused errors when calling the Ollama API. Almost everyone focuses on configuring Ollama to listen on 0.0.0.0:11434 so that it is reachable from other machines, but another issue has been overlooked: Docker containers cannot reach the host's port 11434 via localhost. Although #2540 suggests using http://host.docker.internal:11434/ as a solution, this only works out of the box on Windows and macOS; it does not work on Linux systems.
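For completeness, the 0.0.0.0 binding that most of those threads discuss is controlled by Ollama's `OLLAMA_HOST` environment variable. On a systemd-based Linux host it is typically set via a service override; a sketch (the drop-in path is the conventional one created by `systemctl edit ollama.service`, shown here as an assumption about your setup):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

After editing, reload and restart (`systemctl daemon-reload && systemctl restart ollama`). This makes Ollama reachable from other machines and from containers, but it does not by itself fix the Linux `host.docker.internal` problem described below.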
To fix this, you need to edit the docker-compose.yaml file as shown in the image below: add the statement in the red box under the api service section, and the problem is solved.
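The screenshot is not included here, so the exact statement in the red box is not visible. On Linux, the commonly used way to make `host.docker.internal` resolve inside a container is an `extra_hosts` entry mapping it to Docker's special `host-gateway` value (supported since Docker 20.10). A sketch, assuming this is what the red box shows:

```yaml
services:
  api:
    # Make host.docker.internal resolve to the host on Linux;
    # "host-gateway" is substituted with the host's gateway IP by Docker.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

On Windows and macOS, Docker Desktop provides this hostname automatically, which is why the workaround in #2540 works there without any compose changes.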
I was able to identify the problem because my Ollama is deployed on Server A, while my Dify is deployed on another server, Server B. From Server B, I can call Ollama's API on Server A without any issues using Python's requests library or a browser, but I cannot add Ollama models to Dify, which runs in a Docker container on Server B. Since Ollama is clearly reachable within my network, the issue must be with Dify's configuration.
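The diagnosis above boils down to a simple rule of thumb for which base URL Dify should be given. A hypothetical helper summarizing it (the function, parameter names, and the `server-a.example` hostname are illustrative, not part of Dify's API; 11434 is Ollama's default port):

```python
def ollama_base_url(same_host: bool, dify_in_docker: bool,
                    remote_host: str = "server-a.example") -> str:
    """Pick the URL Dify should use to reach Ollama (rule of thumb)."""
    if not same_host:
        # Ollama runs on another server: use that server's address directly.
        return f"http://{remote_host}:11434"
    if dify_in_docker:
        # Inside a container, "localhost" is the container itself, not the host.
        return "http://host.docker.internal:11434"
    return "http://localhost:11434"
```

The middle case is exactly the trap discussed in this thread: Dify in Docker plus Ollama on the same host means `localhost:11434` points at the container, and on Linux `host.docker.internal` additionally needs the `extra_hosts` mapping.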