
Offline environment, how to run with local Ollama models? #830

Open
TRYOKETHEPEN opened this issue Dec 19, 2024 · 1 comment
Comments

@TRYOKETHEPEN

Describe the bug

Running with Docker raises an error: read ECONNRESET.
The error points to a network failure while fetching the OpenRouter model list, but I never configured an OpenRouter API key.
I only want to use local Ollama; what should I do?

Link to the Bolt URL that caused the error

\

Steps to reproduce

  1. My server is offline; it can only access our internal registry and GitHub.
  2. Server specs: Ubuntu 20.04, x64.
  3. cd /home/containers
  4. git clone https://github.com/stackblitz-labs/bolt.diy
  5. cd bolt.diy
  6. nano Dockerfile
  7. Modify the first few lines as follows (xxxxx redacted for secrecy):
# change to our internal registry
ARG BASE=xxxxx/node:20.18.0
FROM ${BASE} AS base

# make sure npm registry success
USER root
WORKDIR /app

# Install dependencies (this step is cached as long as the dependencies don't change)
COPY package.json pnpm-lock.yaml ./

# change to our internal registry
RUN npm config set registry http://xxxxx/npm-official/
# corepack enable pnpm does not obey the registry, so change it to npm install
RUN npm install -g pnpm
RUN pnpm install
# RUN corepack enable pnpm && pnpm install
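As an aside on the commented-out line above: corepack does not read `npm config set registry`; it downloads pnpm through its own COREPACK_NPM_REGISTRY environment variable. A minimal sketch, assuming the internal mirror also serves the pnpm package, that would keep the original corepack line instead of switching to npm install:

# Sketch, not verified offline: corepack ignores the npm registry config
# and honors COREPACK_NPM_REGISTRY when fetching pnpm.
ENV COREPACK_NPM_REGISTRY=http://xxxxx/npm-official
RUN corepack enable pnpm && pnpm install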
  8. npm run dockerbuild
  9. docker-compose --profile development up
  10. Access localhost:5173; it raises this error:
(base) root@AIServer:/home/containers/bolt.diy# docker-compose --profile development up
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] Found orphan containers ([bolt-ai]) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up. 
[+] Running 1/0
 ✔ Container boltdiy-app-dev-1  Created                                                                                                                                                         0.0s 
Attaching to app-dev-1
app-dev-1  | 
app-dev-1  | > [email protected] dev /app
app-dev-1  | > node pre-start.cjs  && remix vite:dev "--host" "0.0.0.0"
app-dev-1  | 
app-dev-1  | 
app-dev-1  | ★═══════════════════════════════════════★
app-dev-1  |           B O L T . D I Y
app-dev-1  |          ??  Welcome  ??
app-dev-1  | ★═══════════════════════════════════════★
app-dev-1  | 
app-dev-1  | ?? Current Commit Version: 50e677878446f622531123b19912f38e8246afbd
app-dev-1  | ★═══════════════════════════════════════★
app-dev-1  | [warn] Data fetching is changing to a single fetch in React Router v7
app-dev-1  | ┃ You can use the `v3_singleFetch` future flag to opt-in early.
app-dev-1  | ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_singleFetch
app-dev-1  | ┗
app-dev-1  |   ➜  Local:   http://localhost:5173/
app-dev-1  |   ➜  Network: http://10.1.8.2:5173/
app-dev-1  | TypeError: fetch failed
app-dev-1  |     at node:internal/deps/undici/undici:13185:13
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at Object.getOpenRouterModels [as getDynamicModels] (/app/app/utils/constants.ts:574:5)
app-dev-1  |     at async Promise.all (index 2)
app-dev-1  |     at Module.initializeModelList (/app/app/utils/constants.ts:654:7)
app-dev-1  |     at handleRequest (/app/app/entry.server.tsx:30:3)
app-dev-1  |     at handleDocumentRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:340:12)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:160:18)
app-dev-1  |     at /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
app-dev-1  |   [cause]: Error: read ECONNRESET
app-dev-1  |       at TLSWrap.onStreamRead (node:internal/stream_base_commons:218:20)
app-dev-1  |       at TLSWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
app-dev-1  |     errno: -104,
app-dev-1  |     code: 'ECONNRESET',
app-dev-1  |     syscall: 'read'
app-dev-1  |   }
app-dev-1  | }
app-dev-1  | TypeError: fetch failed
app-dev-1  |     at node:internal/deps/undici/undici:13185:13
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at Object.getOpenRouterModels [as getDynamicModels] (/app/app/utils/constants.ts:574:5)
app-dev-1  |     at async Promise.all (index 2)
app-dev-1  |     at Module.initializeModelList (/app/app/utils/constants.ts:654:7)
app-dev-1  |     at handleRequest (/app/app/entry.server.tsx:30:3)
app-dev-1  |     at handleDocumentRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:390:14)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:160:18)
app-dev-1  |     at /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
app-dev-1  |   [cause]: Error: read ECONNRESET
app-dev-1  |       at TLSWrap.onStreamRead (node:internal/stream_base_commons:218:20)
app-dev-1  |       at TLSWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
app-dev-1  |     errno: -104,
app-dev-1  |     code: 'ECONNRESET',
app-dev-1  |     syscall: 'read'
app-dev-1  |   }
app-dev-1  | }
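The stack trace shows why an offline host crashes even with no OpenRouter key configured: initializeModelList in app/utils/constants.ts fetches every provider's dynamic model list at startup, and a single rejected fetch propagates through Promise.all into the first page render. A minimal sketch (a hypothetical helper, not the project's actual code) of a guard that would let an unreachable provider fail soft:

// Hypothetical helper; assumes the ModelInfo type from app/utils/constants.ts.
// Wraps a dynamic-model fetch so a network failure yields an empty list
// instead of rejecting Promise.all and aborting the first page render.
async function safeDynamicModels(fetcher: () => Promise<ModelInfo[]>): Promise<ModelInfo[]> {
  try {
    return await fetcher();
  } catch (err) {
    console.warn('Skipping unreachable provider:', err);
    return []; // offline: fall back to the static model list only
  }
}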

Expected behavior

\

Screen Recording / Screenshot

as above

Platform

as above

Provider Used

Ollama

Model Used

\

Additional context

No response

@TRYOKETHEPEN
Author

TRYOKETHEPEN commented Dec 19, 2024

[1] The error above comes from trying to access https://openrouter.ai/api/v1/models.
[2] I downloaded the models.json file in an online environment.
[3] Then I changed constants.ts as follows:

async function getOpenRouterModels(): Promise<ModelInfo[]> {
  // const data: OpenRouterModelsResponse = await (
  //   await fetch('https://openrouter.ai/api/v1/models', {
  //     headers: {
  //       'Content-Type': 'application/json',
  //     },
  //   })
  // ).json();

  // Read the model list from the snapshot saved while online.
  // NOTE: this needs `import fs from 'node:fs';` at the top of constants.ts.
  const data: OpenRouterModelsResponse = JSON.parse(fs.readFileSync('/home/containers/bolt.diy/models.json', 'utf8'));

  return data.data
    .sort((a, b) => a.name.localeCompare(b.name))
    .map((m) => ({
      name: m.id,
      label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
        2,
      )} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(m.context_length / 1000)}k`,
      provider: 'OpenRouter',
      maxTokenAllowed: 8000,
    }));
}
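Note that this patch hard-codes a host path; once the app moves back into Docker, models.json would have to be copied or mounted into the container at the same path for the read to succeed.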

[4] Now I can open the localhost:5173 web page after npm run dev (Docker can be revisited once the local run works).
[5] But how do I load my local Ollama models (http://localhost:11434)?
[6] I've confirmed my local Ollama models are listed correctly via http://localhost:11434/v1/models.

[7] I've tried adding "http://localhost:11434" to Settings → Providers → Ollama → Base URL, and to the .env.local file, but neither worked (see the sketch below).
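Two things may be worth checking for [7], as a sketch under assumptions rather than a confirmed fix. The variable the compose file reads (visible in the WARN lines above) is OLLAMA_API_BASE_URL, and when the app itself runs inside a container, localhost resolves to the container rather than to the host where Ollama listens:

# .env.local — app running directly on the host (npm run dev)
OLLAMA_API_BASE_URL=http://127.0.0.1:11434

# App running inside Docker: localhost would be the container itself.
# host.docker.internal is a common way to reach the host (setup-dependent).
# OLLAMA_API_BASE_URL=http://host.docker.internal:11434

Also, Ollama's native model-list endpoint is http://localhost:11434/api/tags (the /v1/models route is the OpenAI-compatibility shim), so if bolt.diy queries the native API, /api/tags is the closer test of what the provider will actually see.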

TRYOKETHEPEN changed the title from "Offline environment, how to run with docker" to "Offline environment, how to run with local Ollama models?" on Dec 19, 2024