
Bugfix Issue 259 - WIP HARD Fix using local Ollama usage #306

Open
wants to merge 3 commits into main

Conversation

@dctfor commented Nov 16, 2024

This fixes the issue about not being able to use a local Ollama instance, but it is a hard fix: there is still work to be done so the model can be chosen dynamically from the dropdown. The code was adapted from the Claude setup, so for now the model is hardcoded to "llama3.1:8b" and the Ollama endpoint is hardcoded to 127.0.0.1:11434.
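For context, here is a minimal sketch of what the hard fix amounts to, calling Ollama's /api/chat endpoint directly. The function name and structure are illustrative only, not the actual provider code in this PR:

```ts
// Illustrative only: both the endpoint and the model are fixed constants,
// which is exactly the limitation described above.
const OLLAMA_BASE_URL = 'http://127.0.0.1:11434'; // hard-coded endpoint
const OLLAMA_MODEL = 'llama3.1:8b';                // hard-coded model

async function chatWithLocalOllama(prompt: string): Promise<string> {
  // Ollama's non-streaming chat endpoint returns a single JSON object
  // with the assistant reply under message.content.
  const res = await fetch(`${OLLAMA_BASE_URL}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: OLLAMA_MODEL,
      messages: [{ role: 'user', content: prompt }],
      stream: false,
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed with status ${res.status}`);
  }
  const data = (await res.json()) as { message: { content: string } };
  return data.message.content;
}
```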

@kekePower

It's a nice thought, but I run Ollama on another machine so it'd be better to keep it the way it is.

@dctfor (Author) commented Nov 17, 2024

Here are my recent comments on the fix for the model with Ollama; I will now double-check the faulty base URL, which might have been ignored if the only issue was the actual model selection:

#259 (comment)

@dctfor (Author) commented Nov 17, 2024

It should now be good with whatever IP address and model you want to use.
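In other words, the endpoint and model should now come from configuration instead of constants. A rough sketch of the idea; the variable names OLLAMA_API_BASE_URL and OLLAMA_MODEL here are assumptions for illustration, not guaranteed by this diff:

```ts
// Sketch: fall back to local defaults only when nothing is configured.
const baseUrl = process.env.OLLAMA_API_BASE_URL ?? 'http://127.0.0.1:11434';
const model = process.env.OLLAMA_MODEL ?? 'llama3.1:8b';
```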

@chrismahoney (Collaborator)

Taking a look at this today

@chrismahoney mentioned this pull request Nov 18, 2024
@mroxso commented Nov 18, 2024

Works for me in combination with removing the .env and .env.local entries from the .dockerignore file. That's needed because otherwise Ollama doesn't get called correctly, I think. (Or you can define the variables directly in the docker-compose.yaml file, as sketched below.)
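For anyone hitting the same thing, here is a hedged example of the second option, defining the value directly in docker-compose.yaml; the service name and variable name are assumptions for illustration:

```yaml
# Illustrative docker-compose override: setting the Ollama endpoint here
# sidesteps .env files that .dockerignore keeps out of the build context.
services:
  app:
    environment:
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```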

@chrismahoney (Collaborator)

You identified that *.local in .dockerignore is likely causing some Docker-related issues; please see #329 for details and feel free to provide feedback there.

@joanjgm commented Nov 23, 2024

> It should now be good with whatever IP address and model you want to use.

It doesn't work for me; I still get an error:

RetryError [AI_RetryError]: Failed after 3 attempts. Last error: Cannot connect to API: connect ECONNREFUSED 127.0.0.1:11434
.
.
.
   cause: Error: connect ECONNREFUSED 127.0.0.1:11434
        at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1610:16)
        at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
      errno: -111,
      code: 'ECONNREFUSED',
      syscall: 'connect',
      address: '127.0.0.1',
      port: 11434
    },
    url: 'http://localhost:11434/api/chat',
    requestBodyValues: {
      format: undefined,
      model: 'qwen2.5-coder:7b',
      options: [Object],
      messages: [Array],
      tools: undefined
    },
    statusCode: undefined,
    responseHeaders: undefined,
    responseBody: undefined,
    isRetryable: true,
    data: undefined,
    [Symbol(vercel.ai.error)]: true,
    [Symbol(vercel.ai.error.AI_APICallError)]: true
  },
  [Symbol(vercel.ai.error)]: true,
  [Symbol(vercel.ai.error.AI_RetryError)]: true
}

This happens even though the connection to list my local models works just fine, and if I try to use it in OpenWebUI it works flawlessly.
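One likely explanation: when the app runs inside Docker, 127.0.0.1 and localhost resolve to the container itself rather than the host where Ollama is listening, so listing models from the host works while the containerized app gets ECONNREFUSED. A hedged sketch of a Docker-aware fallback; the helper and the environment variable name are illustrative, not code from this PR:

```ts
import { existsSync } from 'node:fs';

// Illustrative helper: prefer explicit configuration and only guess a default
// based on whether the process appears to be running inside a container.
function resolveOllamaBaseUrl(): string {
  const configured = process.env.OLLAMA_API_BASE_URL; // assumed variable name
  if (configured) {
    return configured;
  }
  // /.dockerenv exists inside most Docker containers.
  const inDocker = existsSync('/.dockerenv');
  // host.docker.internal reaches the host from Docker Desktop containers;
  // on plain Linux you may need the host's LAN IP or an extra_hosts entry.
  return inDocker ? 'http://host.docker.internal:11434' : 'http://127.0.0.1:11434';
}
```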


@dustinwloring1988 added the WIP (Work In Progress) label Dec 2, 2024
@VictimOfPing

Hi, I have a problem when I use Ollama: when I send it a message, it gives me this error: "There was an error processing your request: An error occurred."

@loki-smip

Same for me:

Failed to get Ollama models: TypeError: fetch failed
at node:internal/deps/undici/undici:13484:13
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at OllamaProvider.getDynamicModels (C:/blot/OTTODEV/app/lib/modules/llm/providers/ollama.ts:41:24)
at async Promise.all (index 0)
at LLMManager.updateModelList (C:/blot/OTTODEV/app/lib/modules/llm/manager.ts:71:27)
at Module.getModelList (C:/blot/OTTODEV/app/utils/constants.ts:43:10)
at Module.streamText (C:/blot/OTTODEV/app/lib/.server/llm/stream-text.ts:99:22)
at chatAction (C:/blot/OTTODEV/app/routes/api.chat.ts:101:20)
at Object.callRouteAction (C:\blot\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\data.js:36:16)
at C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4899:19
at callLoaderOrAction (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4963:16)
at async Promise.all (index 0)
at defaultDataStrategy (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4772:17)
at callDataStrategyImpl (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4835:17)
at callDataStrategy (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3992:19)
at submit (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3755:21)
at queryImpl (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3684:22)
at Object.queryRoute (C:\blot\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3629:18)
at handleResourceRequest (C:\blot\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\server.js:402:20)
at requestHandler (C:\blot\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\server.js:156:18)
at C:\blot\OTTODEV\node_modules.pnpm@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy\node_modules@remix-run\dev\dist\vite\cloudflare-proxy-plugin.js:70:25 {
[cause]: Error: connect ECONNREFUSED ::1:11434
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1615:16)
at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '::1',
port: 11434
}
}
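The ECONNREFUSED ::1:11434 here is slightly different from the 127.0.0.1 case above: on recent Node versions, localhost can resolve to the IPv6 loopback ::1, while Ollama typically listens only on IPv4 127.0.0.1 unless OLLAMA_HOST is changed. One hedged workaround is to avoid the name localhost in the base URL entirely, for example:

```ts
// Illustrative normalization: rewrite "localhost" to "127.0.0.1" so Node does
// not resolve the Ollama URL to ::1, which Ollama may not be listening on.
function normalizeOllamaBaseUrl(raw: string | undefined): string {
  const fallback = 'http://127.0.0.1:11434';
  if (!raw) {
    return fallback;
  }
  return raw.replace('//localhost', '//127.0.0.1');
}
```

Setting the base URL explicitly to http://127.0.0.1:11434 in your environment should have the same effect.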
