A vllm-openai provider for Dify

The vLLM OpenAI-compatible server can apply CFG (Classifier-Free Guidance) through outlines or lm-format-enforcer, configured with extra parameters such as guided_json and guided_regex. Dify's built-in OpenAI-compatible provider cannot pass these parameters, even though models like gpt-4o achieve a similar effect with the JSON output flag.

This project provides a vLLM provider built on top of Dify's OpenAI-compatible provider, adding support for guided_json, guided_regex, and guided_grammar.
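For reference, this is roughly what a guided request looks like when sent directly to a vLLM OpenAI-compatible server with the OpenAI Python client; the base URL, model name, and schema below are placeholders for your own deployment:

```python
from openai import OpenAI

# Placeholder endpoint and model; point these at your own vLLM server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

person_schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

completion = client.chat.completions.create(
    model="my-vllm-model",
    messages=[{"role": "user", "content": "Alice is 30 years old. Extract the person."}],
    # vLLM reads guided-decoding options from the extra body of the request.
    extra_body={"guided_json": person_schema},
)
print(completion.choices[0].message.content)
```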

Dynamic guided parameters per request

The guided parameter may vary within the same workflow, but Dify does not expose an extra parameter on the LLM node to pass it.

The workaround is a little hacky: when the second message in the prompt is an assistant message and its content can be parsed as JSON, this provider uses that content as the guided parameter.

The structure of the parameter is defined in the GuidedParam class.
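As an illustration only, the sketch below shows what such a prompt could look like, assuming the guided parameter uses a guided_json key mirroring vLLM's parameter names; consult the GuidedParam class for the actual field names:

```python
import json

# Hypothetical constraint: force the reply to be a JSON object with a single
# "city" string field. The "guided_json" key is an assumption; see the
# GuidedParam class for the real structure.
guided_param = {
    "guided_json": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    }
}

prompt_messages = [
    {"role": "system", "content": "Answer with the city mentioned in the text."},
    # 2nd message, role "assistant": its JSON content is picked up as the guided param.
    {"role": "assistant", "content": json.dumps(guided_param)},
    {"role": "user", "content": "The Eiffel Tower is in Paris."},
]
```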

Use in Docker

Add a volume mount in docker-compose.yaml:

api:
  image: langgenius/dify-api:XXX
  # add the mount under volumes
  volumes:
    # add this
    - path_to_project/vllm:/app/api/core/model_runtime/model_providers/vllm

Restart docker-compose, and the vLLM provider will appear in Dify's provider list.
