How do I call Tongyi Bailian's VL model? Its path differs from qwen #5330
Comments
Please follow the issue template to update title and description of your issue.
mark
1 similar comment
mark
Is it solved?
🥰 Description of requirements
The models +qwen-vl-plus@Alibaba and +qwen-vl-max@Alibaba have been configured in the environment variables.
The models show up correctly, but the request path is wrong.
Regular qwen models are called via the path v1/services/aigc/text-generation/generation.
The qwen VL models use the path v1/services/aigc/multimodal-generation/generation.
I can't seem to find a way to specify the URL and path for a custom or specific model.
🧐 Solution
Could you describe how to make a specific model use a specific URL and path?
📝 Supplementary information
No response
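For anyone hitting the same problem, below is a minimal sketch of how a client could pick the endpoint path from the model name. The two path strings come from the description above; the helper name `buildAlibabaPath`, the `-vl-` substring check, and the base URL in the comments are illustrative assumptions, not the project's actual implementation.

```typescript
// Minimal sketch (assumed names and routing logic, not the project's code):
// choose the DashScope path based on the model name.
const TEXT_GENERATION_PATH = "v1/services/aigc/text-generation/generation";
const MULTIMODAL_GENERATION_PATH =
  "v1/services/aigc/multimodal-generation/generation";

function buildAlibabaPath(model: string): string {
  // qwen-vl-plus / qwen-vl-max are multimodal models, so they need the
  // multimodal-generation endpoint instead of the text-generation one.
  return model.includes("-vl-") ? MULTIMODAL_GENERATION_PATH : TEXT_GENERATION_PATH;
}

// Example usage (the base URL here is an assumption for illustration):
// `https://dashscope.aliyuncs.com/api/${buildAlibabaPath(model)}`
console.log(buildAlibabaPath("qwen-vl-plus")); // -> multimodal-generation path
console.log(buildAlibabaPath("qwen-turbo"));   // -> text-generation path
```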