Checked other resources
I searched the LangChain documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in LangChain rather than my code.
The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
When I execute it on WSL2 on my Windows machine:
```python
from langchain_community.llms import Ollama

llm = Ollama(
    base_url="http://127.0.0.1",
    model="gemma",
    timeout=100,
)
prompt = "Give me one name for a new national park with jungle terrain?"
print(llm.invoke(prompt))
```
Error Message and Stack Trace (if applicable)
```
ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=80): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f67bdbca000>: Failed to establish a new connection: [Errno 111] Connection refused'))
```
Description
Additionally, I found that:
`ping localhost`:

```
Pinging FeiyuCome [::1] with 32 bytes of data:
Reply from ::1: time<1ms
Reply from ::1: time<1ms
Reply from ::1: time<1ms
Reply from ::1: time<1ms

Ping statistics for ::1:
    Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 0ms, Maximum = 0ms, Average = 0ms
```
`ping 127.0.0.1`:

```
Pinging 127.0.0.1 with 32 bytes of data:
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128

Ping statistics for 127.0.0.1:
    Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 0ms, Maximum = 0ms, Average = 0ms
```
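Note that a successful ping only shows the host answers ICMP; it says nothing about whether a given TCP port is reachable. A minimal standard-library sketch for checking the port itself (host and port values taken from this report; adjust as needed):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, unreachable hosts
        return False

# Port 80 is what the client actually tried; 11434 is Ollama's default port.
print(is_port_open("127.0.0.1", 80))
print(is_port_open("127.0.0.1", 11434))
```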
Have you tried http://127.0.0.1:11434 instead of just http://127.0.0.1 as the base URL? Ollama listens on port 11434, not port 80, which is what an http:// URL falls back to when no port is given.
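Assuming Ollama is running with its default settings, the suggestion above amounts to spelling out the port in the base URL (a sketch; the model name and timeout are copied from the report):

```python
from urllib.parse import urlsplit

# "http://127.0.0.1" carries no port, so HTTP clients fall back to the
# scheme default (80) -- exactly the port shown in the traceback above.
# Ollama's HTTP API listens on 11434 by default, so it must be explicit.
base_url = "http://127.0.0.1:11434"
assert urlsplit(base_url).port == 11434

# The report's snippet with the corrected base_url (requires
# langchain_community installed and an Ollama server actually running):
# from langchain_community.llms import Ollama
# llm = Ollama(base_url=base_url, model="gemma", timeout=100)
# print(llm.invoke("Give me one name for a new national park with jungle terrain?"))
```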
How can I solve this problem?
System Info
Local Windows environment:
conda env list: