
Error getting OpenAILike models: TypeError: Cannot read properties of undefined (reading 'map') #652

Open
dreher-in opened this issue Dec 11, 2024 · 10 comments


dreher-in commented Dec 11, 2024

Describe the bug

Just pulled the latest version (mine was still oTTodev, now the latest bolt.diy) and rebuilt my docker stack. Now the OpenAILike provider seems broken and has stopped working. It worked before the upgrade, so I assume the problem is not on my side.

docker compose --profile development --env-file .env.local up --build
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
 => => naming to docker.io/library/bolt-ai:development                                                                                                               0.0s
 => [app-dev] resolving provenance for metadata file                                                                                                                 0.0s
WARN[0003] Found orphan containers ([boltnew-any-llm-bolt-ai-dev-1]) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up. 
[+] Running 1/1
 ✔ Container boltnew-any-llm-app-dev-1  Recreated                                                                                                                    1.0s 
Attaching to app-dev-1
app-dev-1  | 
app-dev-1  | > bolt@ dev /app
app-dev-1  | > remix vite:dev "--host" "0.0.0.0"
app-dev-1  | 
app-dev-1  | [warn] Route discovery/manifest behavior is changing in React Router v7
app-dev-1  | ┃ You can use the `v3_lazyRouteDiscovery` future flag to opt-in early.
app-dev-1  | ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_lazyRouteDiscovery
app-dev-1  | ┗
app-dev-1  | [warn] Data fetching is changing to a single fetch in React Router v7
app-dev-1  | ┃ You can use the `v3_singleFetch` future flag to opt-in early.
app-dev-1  | ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_singleFetch
app-dev-1  | ┗
app-dev-1  |   ➜  Local:   http://localhost:5173/
app-dev-1  |   ➜  Network: http://192.168.0.2:5173/
app-dev-1  | Error getting OpenAILike models: TypeError: Cannot read properties of undefined (reading 'map')
app-dev-1  |     at Object.getOpenAILikeModels [as getDynamicModels] (/app/app/utils/constants.ts:400:21)
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at async Promise.all (index 1)
app-dev-1  |     at Module.initializeModelList (/app/app/utils/constants.ts:457:9)
app-dev-1  |     at handleRequest (/app/app/entry.server.tsx:30:3)
app-dev-1  |     at handleDocumentRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:340:12)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:160:18)
app-dev-1  |     at /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25

On the litellm side I can see:
litellm-1 | {"message": "litellm.proxy.proxy_server.user_api_key_auth(): Exception occured - Malformed API Key passed in. Ensure Key has Bearer prefix. Passed in: Bearer\nRequester IP Address:192.168.178.8", "level": "ERROR", "timestamp": "2024-12-11T21:03:08.435971", "stacktrace": "Traceback (most recent call last):\n File \"/usr/local/lib/python3.11/site-packages/litellm/proxy/auth/user_api_key_auth.py\", line 569, in user_api_key_auth\n raise Exception(\nException: Malformed API Key passed in. Ensure Key hasBearer prefix. Passed in: Bearer"}
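
A plausible chain between the two logs, as a hedged sketch (names hypothetical, not the actual bolt.diy code): when litellm rejects the request, the `/models` response body is an error object with no `data` array, so code along the lines of `res.data.map(...)` in `constants.ts` throws the `TypeError` above. Guarding against a missing `data` field would fail soft instead of crashing startup:

```typescript
// Hypothetical defensive parser for an OpenAI-compatible /models response.
// On an auth failure the body may carry an `error` object and no `data`
// array, which is what makes an unguarded `res.data.map(...)` throw.
interface ModelsResponse {
  data?: Array<{ id: string }>;
}

function parseModelIds(res: ModelsResponse): string[] {
  if (!Array.isArray(res.data)) {
    return []; // auth failure or malformed body: report no models
  }
  return res.data.map((m) => m.id);
}
```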

Link to the Bolt URL that caused the error

NA, local

Steps to reproduce

Clone the current git and configure OPENAI_LIKE_API_BASE_URL and OPENAI_LIKE_API_KEY in .env.local and start it.

Expected behavior

Working OpenAILike models (here: litellm)

Screen Recording / Screenshot

No response

Platform

  • OS: Debian 12
  • Browser: NA
  • Version: cf21dde

Provider Used

Litellm

Model Used

NA

Additional context

No response


dreher-in commented Dec 11, 2024

Did some further debugging and found that bolt fails to add the API key to the request, see here:

2024/12/11 22:33:06 request received:
GET /models HTTP/1.1
Host: 192.168.178.250:8222
Accept: */*
Accept-Encoding: gzip, deflate
Accept-Language: *
Authorization: Bearer
Connection: keep-alive
Sec-Fetch-Mode: cors
User-Agent: node

OPENAI_LIKE_API_KEY is properly set.
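
The dump shows `Authorization: Bearer` with an empty token, which is exactly what litellm rejects as a malformed key. As a sketch (helper name hypothetical, not the actual bolt.diy code), the header should only be attached when a non-empty key is available:

```typescript
// Hypothetical helper: only attach the Authorization header when a
// non-empty key is present. Sending "Authorization: Bearer" with an
// empty token is what litellm rejects as a malformed API key.
function buildModelsHeaders(apiKey?: string): Record<string, string> {
  const headers: Record<string, string> = { Accept: "application/json" };
  if (apiKey && apiKey.trim() !== "") {
    headers.Authorization = `Bearer ${apiKey}`;
  }
  return headers;
}
```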

@dreher-in (Author)

Dirty fix for me in app/utils/constants.ts:

Line 410: let apiKey = import.meta.env.OPENAI_LIKE_API_KEY;

There seems to be an issue with the following lines:

    if (apiKeys && apiKeys.OpenAILike) {
      apiKey = apiKeys.OpenAILike;
    }

I am not deep enough into the codebase to quickly spot what's wrong.
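
One possible explanation, as an illustrative sketch (names hypothetical, not the actual fix): `apiKeys` comes from the UI/cookies and can be `undefined` during server startup, so the lookup needs to fall back to the environment variable, which is what the dirty fix above does by hand:

```typescript
// Hypothetical fallback chain for resolving the OpenAILike key:
// prefer the key set in the UI (cookies), fall back to the env var.
// `apiKeys` can be undefined on server startup, before any UI input.
function resolveOpenAILikeKey(
  apiKeys: Record<string, string> | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  return apiKeys?.OpenAILike || env.OPENAI_LIKE_API_KEY;
}
```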


bgoosmanviz commented Dec 12, 2024

Same here. Exact same workaround as you. I also had to enable multiple providers so that I could switch between them. If only OpenAILike is enabled, then the Anthropic models are loaded first (because Anthropic is alphabetically first in the provider list) instead of OpenAILike.

@yodaljit

Fixed this issue in #693 . The error was that Anthropic was always the default provider, even after switching the provider to OpenAILike or OpenRouter, so only the Anthropic models were loaded. Now the models are loaded from the selected provider without any issue.
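
The behavior described (Anthropic loading because it sorts first) can be sketched as filtering by the selected provider instead of taking the alphabetically first one; this is an illustrative sketch, not the code from #693:

```typescript
// Illustrative sketch (not the actual #693 change): return the models
// of the provider the user selected, instead of defaulting to the
// alphabetically first provider in the list.
interface ModelInfo {
  name: string;
  provider: string;
}

function modelsForProvider(models: ModelInfo[], selected: string): ModelInfo[] {
  return models.filter((m) => m.provider === selected);
}
```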

@thecodacus (Collaborator)

@dreher-in try setting the API key in the UI and check.

I would suggest pulling the latest branch and then doing the following:

add the URL on the settings tab
[screenshot]

then add the API key in the chat box
[screenshot]

Please let us know if the issue is resolved; we need feedback to understand the exact problem.

@jiaohuix

@thecodacus hi, I'm also encountering a similar issue. Could you please advise on how to access the web UI to configure the API key?

@thecodacus (Collaborator)

You can access the settings menu from the left sidebar, in the bottom-left corner:
[screenshot]


jiaohuix commented Dec 17, 2024

@thecodacus Ohhh Thank you! However, I’m encountering an issue with the OpenAILike model.

I followed your instructions: I first set the urlBase in the settings, and then entered my API key in the “OpenAILike API Key:” field. After refreshing the website, the model list appeared correctly, as shown below:
[screenshot of model list]

However, when I attempt to chat with the GPT model, I receive an error message:
[screenshot of error message]

Could you please advise on how to resolve this issue?

@thecodacus (Collaborator)

Do you see any errors in the terminal?

@thecodacus (Collaborator)

Also, the version looks old. Can you try the latest stable version?
