
Ollama local add #118

Merged: 5 commits merged into stackblitz-labs:main on Nov 16, 2024
Conversation

@armfuls commented Oct 29, 2024

Ollama as the standard local provider

@hillct commented Nov 1, 2024

Defaulting the Ollama API URL to localhost is undesirable for the vast majority of users who have not deployed Ollama locally, as the UI will always throw an error about connecting to a service the user never declared as existing or available. Better to retain the existing behavior of allowing the user to enable the functionality only if it exists within their infrastructure.
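
That gating could be expressed along these lines (a minimal sketch; the `OLLAMA_API_BASE_URL` variable name and the `Provider` shape are assumptions for illustration, not necessarily the project's actual config):

```typescript
// Hypothetical sketch: only register the Ollama provider when the user
// has explicitly configured a base URL, rather than defaulting to localhost.
interface Provider {
  name: string;
  baseUrl: string;
}

function getOllamaProvider(env: Record<string, string | undefined>): Provider | null {
  // OLLAMA_API_BASE_URL is an assumed variable name for this example.
  const baseUrl = env.OLLAMA_API_BASE_URL;
  if (!baseUrl) {
    // Not configured: skip the provider entirely so the UI never
    // tries (and fails) to reach a server that was never deployed.
    return null;
  }
  return { name: 'Ollama', baseUrl };
}
```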

@wonderwhy-er (Collaborator) commented
And I am an LM Studio user, for example, so why Ollama?
I am planning to add "free" models from various providers as defaults.

Also, I want to persist the user's choice, so that once you choose Ollama it stays selected across reloads until you change it.
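
That persistence could look something like this (a minimal sketch assuming a browser context and a hypothetical storage key, not the code actually merged in this PR):

```typescript
// Hypothetical sketch: remember the selected provider in localStorage
// so the choice survives page reloads.
const PROVIDER_KEY = 'selectedProvider'; // assumed key name

export function saveProviderChoice(provider: string): void {
  localStorage.setItem(PROVIDER_KEY, provider);
}

export function loadProviderChoice(defaultProvider: string): string {
  // Fall back to the default when the user has never made a choice.
  return localStorage.getItem(PROVIDER_KEY) ?? defaultProvider;
}
```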

@wonderwhy-er wonderwhy-er merged commit b7d609d into stackblitz-labs:main Nov 16, 2024