Docker Production Setup Fails: Env Variables Missing and Ollama Connection Error #353
> ❯ docker-compose --profile production up
> WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
> WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
> WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
> [+] Running 2/2
> ✔ Network boltnew-any-llm_default Created 0.0s
> ✔ Container boltnew-any-llm-bolt-ai-1 Created 0.1s

@coleam00 @emcconnell @wonderwhy-er Do we pull in #222 here for now, and incur a small bit of tech debt to manage the list of env vars being added to bindings.sh? I think regardless of the solution, we should deal with this sooner rather than later. Also @av1155, I see you're using Arc; although that's unrelated to this issue, you'll want to be on Chrome Canary while using oTToDev for the time being.
@chrismahoney I see! Thank you for the tip, will do so!
Also @av1155, for now please try using the development profile for Docker instead of production. Both should be expected to work, obviously, but as this is a relatively recent project (and we don't receive downstream changes from bolt) there will be more immediate updates to dev versus prod. This is more my opinion than anything else.
Same issue here. Specifying the .env.local file was the only thing that worked.
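For reference, a minimal `.env.local` covering the variables named in the compose warnings might look like this (a sketch with placeholder values; only set the keys for providers you actually use):

```shell
# .env.local -- variable names taken from the docker-compose warnings above;
# the values here are placeholders, not real keys.
GROQ_API_KEY=your-groq-key
HuggingFace_API_KEY=your-huggingface-key
OPEN_ROUTER_API_KEY=your-openrouter-key
GOOGLE_GENERATIVE_AI_API_KEY=your-google-key
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```

Compose only reads this automatically from `.env`; a file named `.env.local` has to be passed explicitly, e.g. `docker compose --env-file .env.local --profile development up`.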
After doing what @yamini said, I was able to load the .env.local variables, but Ollama still does not work. Pnpm is the only way it will work for me so far, as mentioned before. I just wrote "Hey", and that happened: it thought for a second and then broke. For some reason, the system attempts to use the Claude model (note `model: 'claude-3-5-sonnet-latest'` in the log below).

Log:

❯ docker compose --env-file .env.local --profile development up --build
[+] Building 0.4s (12/12) FINISHED docker:desktop-linux
=> [bolt-ai-dev internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 2.01kB 0.0s
=> [bolt-ai-dev internal] load metadata for docker.io/library/node:20.18.0 0.3s
=> [bolt-ai-dev internal] load .dockerignore 0.0s
=> => transferring context: 427B 0.0s
=> [bolt-ai-dev base 1/5] FROM docker.io/library/node:20.18.0@sha256:a7a3b7ec6de4b11bb2d673b31de9d28c6da09c557ee65453672c8e4f754c23fc 0.0s
=> => resolve docker.io/library/node:20.18.0@sha256:a7a3b7ec6de4b11bb2d673b31de9d28c6da09c557ee65453672c8e4f754c23fc 0.0s
=> [bolt-ai-dev internal] load build context 0.0s
=> => transferring context: 7.43kB 0.0s
=> CACHED [bolt-ai-dev base 2/5] WORKDIR /app 0.0s
=> CACHED [bolt-ai-dev base 3/5] COPY package.json pnpm-lock.yaml ./ 0.0s
=> CACHED [bolt-ai-dev base 4/5] RUN corepack enable pnpm && pnpm install 0.0s
=> CACHED [bolt-ai-dev base 5/5] COPY . . 0.0s
=> CACHED [bolt-ai-dev bolt-ai-development 1/1] RUN mkdir -p ${WORKDIR}/run 0.0s
=> [bolt-ai-dev] exporting to image 0.0s
=> => exporting layers 0.0s
=> => exporting manifest sha256:7778322d15c9bdfa08c13ecfd508668873ce1a3dfe1963af9d69cca977f7a4f9 0.0s
=> => exporting config sha256:c91758ad11729743b58f8ebe2ab47758b056a0edaed450b13263950789f81c54 0.0s
=> => exporting attestation manifest sha256:8c803e09ac34cf1b13c993b444deba19422a27a184749688ce4c6d9012b4f346 0.0s
=> => exporting manifest list sha256:ce4a938bfc2d15f0dd50c613f7af651d0a2df4c687350b586d281436041bee7f 0.0s
=> => naming to docker.io/library/bolt-ai:development 0.0s
=> => unpacking to docker.io/library/bolt-ai:development 0.0s
=> [bolt-ai-dev] resolving provenance for metadata file 0.0s
[+] Running 1/0
✔ Container boltnew-any-llm-bolt-ai-dev-1 Recreated 0.0s
Attaching to bolt-ai-dev-1
bolt-ai-dev-1 |
bolt-ai-dev-1 | > bolt@ dev /app
bolt-ai-dev-1 | > remix vite:dev "--host" "0.0.0.0"
bolt-ai-dev-1 |
bolt-ai-dev-1 | ➜ Local: http://localhost:5173/
bolt-ai-dev-1 | ➜ Network: http://172.19.0.2:5173/
bolt-ai-dev-1 | RetryError [AI_RetryError]: Failed after 3 attempts. Last error: Cannot connect to API: connect ECONNREFUSED 127.0.0.1:11434
bolt-ai-dev-1 | at _retryWithExponentialBackoff (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:98:13)
bolt-ai-dev-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
bolt-ai-dev-1 | at async startStep (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3903:13)
bolt-ai-dev-1 | at async fn (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3977:11)
bolt-ai-dev-1 | at async file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:256:22
bolt-ai-dev-1 | at async chatAction (/app/app/routes/api.chat.ts:64:20)
bolt-ai-dev-1 | at async Object.callRouteAction (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:37:16)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4612:21
bolt-ai-dev-1 | at async callLoaderOrAction (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4677:16)
bolt-ai-dev-1 | at async Promise.all (index 1)
bolt-ai-dev-1 | at async callDataStrategyImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4552:17)
bolt-ai-dev-1 | at async callDataStrategy (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4041:19)
bolt-ai-dev-1 | at async submit (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3900:21)
bolt-ai-dev-1 | at async queryImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3858:22)
bolt-ai-dev-1 | at async Object.queryRoute (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3827:18)
bolt-ai-dev-1 | at async handleResourceRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:413:20)
bolt-ai-dev-1 | at async requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_qwyxqdhnwp3srgtibfrlais3ge/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
bolt-ai-dev-1 | cause: undefined,
bolt-ai-dev-1 | reason: 'maxRetriesExceeded',
bolt-ai-dev-1 | errors: [
bolt-ai-dev-1 | APICallError [AI_APICallError]: Cannot connect to API: connect ECONNREFUSED 127.0.0.1:11434
bolt-ai-dev-1 | at postToApi (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.js:446:15)
bolt-ai-dev-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
bolt-ai-dev-1 | at async OllamaChatLanguageModel.doStream (/app/node_modules/.pnpm/[email protected][email protected]/node_modules/ollama-ai-provider/dist/index.js:485:50)
bolt-ai-dev-1 | at async fn (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3938:23)
bolt-ai-dev-1 | at async file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:256:22
bolt-ai-dev-1 | at async _retryWithExponentialBackoff (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:86:12)
bolt-ai-dev-1 | at async startStep (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3903:13)
bolt-ai-dev-1 | at async fn (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3977:11)
bolt-ai-dev-1 | at async file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:256:22
bolt-ai-dev-1 | at async chatAction (/app/app/routes/api.chat.ts:64:20)
bolt-ai-dev-1 | at async Object.callRouteAction (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:37:16)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4612:21
bolt-ai-dev-1 | at async callLoaderOrAction (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4677:16)
bolt-ai-dev-1 | at async Promise.all (index 1)
bolt-ai-dev-1 | at async callDataStrategyImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4552:17)
bolt-ai-dev-1 | at async callDataStrategy (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4041:19)
bolt-ai-dev-1 | at async submit (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3900:21)
bolt-ai-dev-1 | at async queryImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3858:22)
bolt-ai-dev-1 | at async Object.queryRoute (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3827:18)
bolt-ai-dev-1 | at async handleResourceRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:413:20)
bolt-ai-dev-1 | at async requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_qwyxqdhnwp3srgtibfrlais3ge/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
bolt-ai-dev-1 | cause: [Error],
bolt-ai-dev-1 | url: 'http://127.0.0.1:11434/api/chat',
bolt-ai-dev-1 | requestBodyValues: [Object],
bolt-ai-dev-1 | statusCode: undefined,
bolt-ai-dev-1 | responseHeaders: undefined,
bolt-ai-dev-1 | responseBody: undefined,
bolt-ai-dev-1 | isRetryable: true,
bolt-ai-dev-1 | data: undefined,
bolt-ai-dev-1 | [Symbol(vercel.ai.error)]: true,
bolt-ai-dev-1 | [Symbol(vercel.ai.error.AI_APICallError)]: true
bolt-ai-dev-1 | },
bolt-ai-dev-1  |     ... (second and third retry attempts omitted: APICallError entries identical to the first) ...
bolt-ai-dev-1 | ],
bolt-ai-dev-1 | lastError: APICallError [AI_APICallError]: Cannot connect to API: connect ECONNREFUSED 127.0.0.1:11434
bolt-ai-dev-1 | at postToApi (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.js:446:15)
bolt-ai-dev-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
bolt-ai-dev-1 | at async OllamaChatLanguageModel.doStream (/app/node_modules/.pnpm/[email protected][email protected]/node_modules/ollama-ai-provider/dist/index.js:485:50)
bolt-ai-dev-1 | at async fn (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3938:23)
bolt-ai-dev-1 | at async file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:256:22
bolt-ai-dev-1 | at async _retryWithExponentialBackoff (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:86:12)
bolt-ai-dev-1 | at async startStep (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3903:13)
bolt-ai-dev-1 | at async fn (file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:3977:11)
bolt-ai-dev-1 | at async file:///app/node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:256:22
bolt-ai-dev-1 | at async chatAction (/app/app/routes/api.chat.ts:64:20)
bolt-ai-dev-1 | at async Object.callRouteAction (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:37:16)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4612:21
bolt-ai-dev-1 | at async callLoaderOrAction (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4677:16)
bolt-ai-dev-1 | at async Promise.all (index 1)
bolt-ai-dev-1 | at async callDataStrategyImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4552:17)
bolt-ai-dev-1 | at async callDataStrategy (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:4041:19)
bolt-ai-dev-1 | at async submit (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3900:21)
bolt-ai-dev-1 | at async queryImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3858:22)
bolt-ai-dev-1 | at async Object.queryRoute (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/dist/router.cjs.js:3827:18)
bolt-ai-dev-1 | at async handleResourceRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:413:20)
bolt-ai-dev-1 | at async requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
bolt-ai-dev-1 | at async /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_qwyxqdhnwp3srgtibfrlais3ge/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
bolt-ai-dev-1 | cause: Error: connect ECONNREFUSED 127.0.0.1:11434
bolt-ai-dev-1 | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16)
bolt-ai-dev-1 | at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
bolt-ai-dev-1 | errno: -111,
bolt-ai-dev-1 | code: 'ECONNREFUSED',
bolt-ai-dev-1 | syscall: 'connect',
bolt-ai-dev-1 | address: '127.0.0.1',
bolt-ai-dev-1 | port: 11434
bolt-ai-dev-1 | },
bolt-ai-dev-1 | url: 'http://127.0.0.1:11434/api/chat',
bolt-ai-dev-1 | requestBodyValues: {
bolt-ai-dev-1 | format: undefined,
bolt-ai-dev-1 | model: 'claude-3-5-sonnet-latest',
bolt-ai-dev-1 | options: [Object],
bolt-ai-dev-1 | messages: [Array],
bolt-ai-dev-1 | tools: undefined
bolt-ai-dev-1 | },
bolt-ai-dev-1 | statusCode: undefined,
bolt-ai-dev-1 | responseHeaders: undefined,
bolt-ai-dev-1 | responseBody: undefined,
bolt-ai-dev-1 | isRetryable: true,
bolt-ai-dev-1 | data: undefined,
bolt-ai-dev-1 | [Symbol(vercel.ai.error)]: true,
bolt-ai-dev-1 | [Symbol(vercel.ai.error.AI_APICallError)]: true
bolt-ai-dev-1 | },
bolt-ai-dev-1 | [Symbol(vercel.ai.error)]: true,
bolt-ai-dev-1 | [Symbol(vercel.ai.error.AI_RetryError)]: true
bolt-ai-dev-1 | } |
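A likely explanation for the `ECONNREFUSED 127.0.0.1:11434` above: inside a container, `127.0.0.1` refers to the container itself, not the host machine where Ollama is listening. A commonly used workaround (a sketch, assuming Docker Desktop, where `host.docker.internal` resolves to the host) is to point the base URL at the host instead:

```shell
# .env.local -- variable name taken from the compose warnings above.
# host.docker.internal resolves to the host on Docker Desktop (macOS/Windows).
OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```

On plain Linux engines, `host.docker.internal` may additionally require `extra_hosts: ["host.docker.internal:host-gateway"]` on the service in the compose file.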
I have the same issue with local Llama models (any of them). It may be a problem with parsing the Llama model name, because it seems that on a parsing error the default model name is used, and that is Claude 3.5 Sonnet. That doesn't really explain the inability to connect to port 11434, though (I have just the same issue there); maybe that's a whole other matter. When I tried to change the default constants to 'Ollama' and 'qwen2.5-coder:7b-32k', the connection error was still there; only the model name in the error message changed to the qwen one. Here, look:
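The silent fallback described above can be sketched as follows (hypothetical code; the names and the `[Model: ...]` prefix are illustrative assumptions, not the repo's actual implementation):

```typescript
// Hypothetical sketch of the reported behaviour: if the model name cannot
// be parsed out of the incoming message, a hard-coded default is silently
// substituted -- which would explain Claude appearing in requests that are
// then sent to the Ollama URL.
const DEFAULT_MODEL = "claude-3-5-sonnet-latest"; // assumed default

function extractModel(message: string): string {
  // Assume messages may carry a "[Model: ...]" prefix set by the UI.
  const match = message.match(/^\[Model: (.*?)\]/);
  // On a parse failure the default wins, regardless of the selected provider.
  return match ? match[1] : DEFAULT_MODEL;
}

extractModel("Hey");                          // no prefix: falls back to the default
extractModel("[Model: qwen2.5-coder:7b] Hi"); // prefix present: parsed model name
```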
Same issues I posted about yesterday on the Ottodev Group; hopefully we can get it figured out. Manually specifying the Docker env didn't resolve mine. I still can't see any of the Ollama models in the dropdown.
@ColbyAttack, @jeugregg, @dimanicus, @av1155 - this issue where it still tries to use Claude when the provider is changed to Ollama is generally fixed by restarting the container. Obviously not ideal, but we are still looking into this problem specifically!
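Until that's properly fixed, the restart workaround might look like this (a sketch; profile and env-file names as they appear in the logs above):

```shell
# Restart the dev container so the provider/model selection is re-read
docker compose --profile development restart

# Or, more thoroughly, recreate it from scratch:
docker compose --profile development down
docker compose --env-file .env.local --profile development up --build
```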
I could not get it to work. My .env.local only had these lines altered (context size is that of the model: qwen2.5-coder:32b). I tried:

- building production
- running production
- running production with --env-file
- building development
- running development
- running development with --env-file
- restarting the Docker containers

Retrieving dependencies on the bare system went fine, yet I can not even use it outside of Docker. In the web UI, I selected "Ollama", and I can see the two models I have installed (one of which is the extended context), so some part of oTToDEV is communicating with my Ollama server.
I still can not launch oTTo through docker, but I managed to make the bare version work by undoing 12dcb8d.
Yes, we pushed a fix for this now @recallmenot! It shouldn't try to use Claude anymore when you have Ollama selected, either.
Same issue with the non-Docker implementation, although I'm not using Ollama; it still errors on Ollama. Also, Tailwind is broken.
Please check the latest build.
This happens with the latest build. Just tested a clean build with the latest pnpm and all packages. See apps/utils/constants.ts, line 319:
I suppressed the error in my code and everything works fine. And I'll pull the latest version when an official fix/decision is made.
Thanks for the information. I will look more into this tomorrow.
Actually, the code was right there. I just uncommented it:
Not sure why or when it was commented out. |
We ended up commenting it out due to an issue it was causing. Does everything else work after this? If so, feel free to make a pull request so it can be merged.
If you know how to try pull requests, try this one and let me know if it fixes it, as it fixes a couple of issues: #526
I am having the same issue as what was reported here! It looks like it is ignoring the .env.local file.
It will ignore the env file, as it's in the .dockerignore list. Can you try this PR: #1008
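A quick way to see whether the env file is excluded from the build context is to look for env entries in `.dockerignore`. The sketch below writes an assumed example file to a temp dir rather than touching the repo's real one:

```shell
# Sketch: an assumed .dockerignore (NOT the repo's actual file) showing how
# an entry would keep .env.local out of the image build context.
tmpdir=$(mktemp -d)
printf '%s\n' 'node_modules' '.env.local' > "$tmpdir/.dockerignore"
grep -n 'env' "$tmpdir/.dockerignore"   # lists the excluding entry with its line number
```

Removing (or commenting out) the matching line in the real file would let the env file into the build context.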
Describe the bug

I have attempted to set up the production environment using Docker, following the steps outlined in the README, as well as additional troubleshooting steps, but the setup consistently fails. My primary goal is to integrate Ollama.

Even though I create the .env.local file and input all the necessary information for the OpenAI, Anthropic, and Ollama URLs, the following error persists when executing the commands npm run dockerbuild:prod and docker-compose --profile production up:

Issues:
Environment Variables Not Set:
Despite creating a .env.local file with the required environment variables (e.g., OpenAI, Anthropic, Ollama URL), they are not being recognized during the build process. The following warning messages appear:

Ollama Error in UI:
When attempting to send a message via Ollama, the UI shows the error:
"There was an error processing your request: No details were returned."
The logs include the following error message:
Additional Context:
The Ollama URL is http://127.0.0.1:11434, and I have verified it is accessible.

Full Log Output:
Link to the Bolt URL that caused the error
http://0.0.0.0:5173/
Steps to reproduce
Expected behavior
The Docker setup should:
Actual Behavior:
If I use the simple pnpm process, it works well, but I want to set up a Docker container for this so it's always running and I can just open a browser and use it.
No response
Platform
Additional context
No response