How do I set the default model to a non OpenAI backend? #130
-
For example, say I am using both Ollama and GPT4All Desktop, each with a couple of models available:

```elisp
(gptel-make-gpt4all
 "GPT4All"                        ;Name of your choosing
 :protocol "http"
 :host "localhost:4891"           ;Where it's running
 :models '("mistral-7b-instruct-v0.1.Q4_0.gguf"
           "mistral-7b-openorca.Q4_0.gguf"
           "orca-mini-3b-gguf2-q4_0.gguf")) ;Available models

(gptel-make-ollama
 "Ollama"                         ;Any name of your choosing
 :host "localhost:11434"          ;Where it's running
 :models '("zephyr:latest" "mistral:latest") ;Installed models
 :stream t)                       ;Stream responses
```
How would I specify which model should be the default? If I set it like this:

```elisp
(setq-default gptel-model "Ollama:zephyr:latest")
```

then … If I set it like this:

```elisp
(setq-default gptel-model "zephyr:latest")
```

then … If I use …
Replies: 1 comment
-
@nickanderson read the docs more thoroughly ... https://github.com/karthink/gptel/blob/0192fa07f3f8d2d032b71acd1fae12755207dd76/README.org#ollama

```elisp
;; OPTIONAL configuration
(setq-default gptel-model "mistral:latest" ;Pick your default model
              gptel-backend (gptel-make-ollama "Ollama" :host ...))
```

So, it appears that this does what I was looking for:

```elisp
(setq-default gptel-backend
              (gptel-make-ollama
               "Ollama"                 ;Any name of your choosing
               :host "localhost:11434"  ;Where it's running
               :models '("zephyr:latest" "mistral:latest") ;Installed models
               :stream t))              ;Stream responses
(setq-default gptel-model "zephyr:latest")
```
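For reference, the two pieces can be combined into a single configuration that registers both backends and makes Ollama the default. This is a sketch assuming the same hosts and model names used above; only one backend can be the default, but the other remains selectable from gptel's transient menu:

```elisp
;; Register GPT4All Desktop as an available (non-default) backend.
(gptel-make-gpt4all
 "GPT4All"
 :protocol "http"
 :host "localhost:4891"
 :models '("mistral-7b-instruct-v0.1.Q4_0.gguf"))

;; Register Ollama and capture the backend object so it can be
;; installed as the default in one step.
(setq-default gptel-backend
              (gptel-make-ollama
               "Ollama"
               :host "localhost:11434"
               :models '("zephyr:latest" "mistral:latest")
               :stream t)
              gptel-model "zephyr:latest") ;Default model on that backend
```

Note that `gptel-model` names a model belonging to the backend in `gptel-backend` (here `"zephyr:latest"`, not `"Ollama:zephyr:latest"`); the backend is selected separately via `gptel-backend`.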