
use local model #347

Answered by sg1fan
ValValu asked this question in Q&A

I got this working locally by setting the config option api_host_cmd = 'echo -n http://localhost:5000' while running https://github.com/oobabooga/text-generation-webui, but I'd imagine it works with any server that supports the OpenAI API.
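Judging by its name and value, api_host_cmd presumably works by executing the given shell command and using its stdout as the API base URL. A minimal Python sketch of that resolution step (the function name resolve_api_host is hypothetical; this is an illustration, not the plugin's actual implementation):

```python
import subprocess

def resolve_api_host(api_host_cmd: str) -> str:
    """Run a shell command and use its trimmed stdout as the API host.

    Sketch of how a client might resolve a command-valued host option;
    the real plugin's logic may differ.
    """
    result = subprocess.run(
        api_host_cmd, shell=True, capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

# The command from the answer above: echo the local server's address.
host = resolve_api_host("echo -n http://localhost:5000")
print(host)
```

Indirecting the host through a command (rather than a plain string) lets the value be computed at startup, e.g. read from a secrets manager or an environment-specific script.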

Replies: 9 comments

Answer selected by jackMort
Category: Q&A
8 participants
This discussion was converted from issue #276 on December 14, 2023 12:37.