Add support for users to specify custom request settings, model and optionally provider specific #14535
base: master
Conversation
fixed #14503 Signed-off-by: Jonas Helming <[email protected]>
fixed #14526 Signed-off-by: Jonas Helming <[email protected]>
@dhuebner Could you check the Ollama adaptations, please?
See this PR for the documentation: eclipse-theia/theia-website#662
@JonasHelming
@JonasHelming @sdirix
@dhuebner I have also already thought about this. Would you mind creating a new ticket and mentioning me there?
Thanks for the work ❤️ ! I found some inconsistencies which should be fixed before we merge.
…l.ts Co-authored-by: Stefan Dirix <[email protected]>
Thank you for the great review. I tried to address all comments (in individual commits) and tested all providers again with the final state.
What it does
Adds support for users to specify custom request settings, per model and optionally per provider.
The reason for making them provider-specific is that providers support different options and sometimes even use different names for the same setting (see below).
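As a sketch of the resolution order this implies, an entry that names both the model and the provider should win over a generic entry for the same model id. The interface and function names below are hypothetical illustrations, not the actual Theia API:

```typescript
// Hypothetical shapes for illustration; the actual Theia interfaces may differ.
interface RequestSettingEntry {
    modelId: string;
    providerId?: string; // optional: restrict the entry to a single provider
    requestSettings: Record<string, unknown>;
}

// Resolve the request settings for a model: an entry that also names the
// current provider takes precedence over a generic entry for the same model.
function resolveRequestSettings(
    entries: RequestSettingEntry[],
    modelId: string,
    providerId: string
): Record<string, unknown> {
    const matches = entries.filter(e =>
        e.modelId === modelId && (!e.providerId || e.providerId === providerId));
    const providerSpecific = matches.find(e => e.providerId === providerId);
    return (providerSpecific ?? matches[0])?.requestSettings ?? {};
}
```

With two entries for the same model, one generic and one scoped to `ollama`, a request through the Ollama provider would pick up the scoped settings while any other provider falls back to the generic ones.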
How to test
Add the settings below and adapt them to the models you have.
Qwen/Qwen2.5-Coder-32B-Instruct is always "warm" on serverless Hugging Face
Starcoder2 3B can be downloaded here: https://huggingface.co/Mozilla/starcoder2-llamafile/tree/main
gemma2 can be downloaded directly with Ollama (ollama serve and ollama run gemma2)
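The settings snippet referenced above did not survive extraction. As a hedged sketch only, provider-specific request settings for the models listed here might look roughly like this; the preference key, field names, and option values are assumptions for illustration, not taken from this PR (note how Ollama's `num_predict` plays the role that other providers express with differently named options):

```json
{
  "ai-features.modelSettings.requestSettings": [
    {
      "modelId": "Qwen/Qwen2.5-Coder-32B-Instruct",
      "requestSettings": { "max_new_tokens": 2048 }
    },
    {
      "modelId": "gemma2",
      "providerId": "ollama",
      "requestSettings": { "num_predict": 30, "stop": ["<file_sep>"] }
    }
  ]
}
```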
Two good test cases:
Follow-ups
This should be the last thing we add to the provider layer before we refactor it all together:
Review checklist
Reminder for reviewers