Error getting OpenAILike models: TypeError: Cannot read properties of undefined (reading 'map') #652
Did some further debugging and found that bolt fails to add the API key, even though OPENAI_LIKE_API_KEY is properly set.
Dirty fix for me at line 410: there seems to be an issue with the following lines. I'm not deep enough into the codebase to quickly spot what's wrong.
Same here, with the exact same workaround. I also had to enable multiple providers so that I could switch between them. If only OpenAILike is enabled, the Anthropic models are loaded first (Anthropic is alphabetically first in the list) instead of OpenAILike.
Fixed this issue in #693. The error was that Anthropic was always the default provider, even after switching the provider to OpenAILike or OpenRouter, so only the Anthropic models were loaded. Now the models are loaded from the selected provider without any issue.
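The fix described above amounts to filtering the model list by the provider the user actually selected, rather than falling back to the default (alphabetically first) provider. A minimal sketch of that idea, with hypothetical names that are not bolt.diy's actual code:

```typescript
// Hypothetical illustration: model entries tagged with their provider.
type ModelInfo = { name: string; provider: string };

const MODEL_LIST: ModelInfo[] = [
  { name: 'claude-3-5-sonnet', provider: 'Anthropic' },
  { name: 'gpt-4o', provider: 'OpenAILike' },
];

// Return models for the *selected* provider; before the fix, the list
// effectively always came from the default (first) provider instead.
function modelsForProvider(models: ModelInfo[], selected: string): ModelInfo[] {
  return models.filter((m) => m.provider === selected);
}
```

With this shape, selecting "OpenAILike" yields only the OpenAILike entries instead of Anthropic's.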
@dreher-in try setting the API key in the UI and check. I would suggest pulling the latest branch and then doing the following: add the URL on the settings tab, then add the API key in the chat box. Please let us know if the issue is resolved; we need feedback to understand the exact problem.
@thecodacus Hi, I'm encountering a similar issue. Could you please advise on how to access the web UI to configure the API key?
@thecodacus Thank you! However, I'm encountering an issue with the OpenAILike model. I followed your instructions: I first set the base URL in the settings, then entered my API key in the "OpenAILike API Key:" field. After refreshing the website, the model list appeared correctly. However, when I attempt to chat with the GPT model, I receive an error message. Could you please advise on how to resolve this issue?
Do you see any errors in the terminal? Also, the version looks old; can you try the latest stable version?
Describe the bug
Just pulled the latest version (mine was still oTTodev, now the latest bolt.diy) and rebuilt my docker stack. Now the OpenAILike seems broken and stopped working. Before the upgrade it worked, so I assume it's not a problem on my side.
On the litellm side I can see:

```
litellm-1 | {"message": "litellm.proxy.proxy_server.user_api_key_auth(): Exception occured - Malformed API Key passed in. Ensure Key has `Bearer` prefix. Passed in: Bearer\nRequester IP Address:192.168.178.8", "level": "ERROR", "timestamp": "2024-12-11T21:03:08.435971", "stacktrace": "Traceback (most recent call last):\n  File \"/usr/local/lib/python3.11/site-packages/litellm/proxy/auth/user_api_key_auth.py\", line 569, in user_api_key_auth\n    raise Exception(\nException: Malformed API Key passed in. Ensure Key has `Bearer` prefix. Passed in: Bearer"}
```
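The log shows the Authorization header arriving as just `Bearer` with no token after it, which is the symptom of interpolating a missing or empty API key into the header. A minimal sketch of that failure mode (the helper is a hypothetical illustration, not bolt.diy's actual code):

```typescript
// Hypothetical illustration: building the Authorization header from an
// API key that may be undefined. When the key is missing, the header
// degenerates to just "Bearer", which litellm rejects as malformed.
function buildAuthHeader(apiKey?: string): string {
  return `Bearer ${apiKey ?? ''}`.trim();
}
```

Here `buildAuthHeader(undefined)` produces `"Bearer"`, matching the `Passed in: Bearer` seen in the litellm log.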
Link to the Bolt URL that caused the error
NA, local
Steps to reproduce
Clone the current git and configure OPENAI_LIKE_API_BASE_URL and OPENAI_LIKE_API_KEY in .env.local and start it.
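For reference, the relevant `.env.local` entries look like the following (variable names are taken from this issue; the values shown are placeholders):

```
# Values are placeholders; point the base URL at your litellm proxy
OPENAI_LIKE_API_BASE_URL=http://localhost:4000
OPENAI_LIKE_API_KEY=sk-your-key
```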
Expected behavior
Working OpenAILike models, here litellm
Screen Recording / Screenshot
No response
Platform
Provider Used
Litellm
Model Used
NA
Additional context
No response