Using Marvin with Mistral-7B-Instruct via llama.cpp #829
Unanswered
lostmygithubaccount asked this question in Q&A
Replies: 1 comment · 4 replies
-
hey @lostmygithubaccount - i think there are two big things here (from my exploration with mistral, which has admittedly been limited compared to what I would prefer)
can you say more about what you've done here, so I can spend some time with this myself sometime?
4 replies
-
Rough steps to get here:
- `make` in llama.cpp to build the server executable
- run `llama.cpp/server` pointing at the gguf

This results in an OpenAI-compatible server on http://localhost:8000 (sanity-check sketch below).
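Before involving Marvin at all, it's worth confirming the endpoint answers OpenAI-style requests. A minimal sanity check, assuming the server exposes `/v1/chat/completions` on port 8000 and ignores the model name and API key (both depend on the llama.cpp version and how the server was launched):

```python
# Sketch: smoke-test the local llama.cpp server with the official openai client.
# Assumptions: OpenAI-style routes under /v1 on port 8000; the model name and
# API key are effectively ignored by the local server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="mistral-7b-instruct",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(resp.choices[0].message.content)
```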
Set up Marvin to point there:
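Roughly like this (a sketch: it assumes Marvin's underlying OpenAI client picks up the standard `OPENAI_*` environment variables, and the `MARVIN_CHAT_COMPLETIONS_MODEL` name is a guess that varies between Marvin versions):

```python
# Sketch: point Marvin at the local llama.cpp server instead of api.openai.com.
# Assumptions: Marvin's OpenAI client honours OPENAI_BASE_URL / OPENAI_API_KEY
# (older openai client versions use OPENAI_API_BASE instead), and the
# MARVIN_CHAT_COMPLETIONS_MODEL variable name may differ by Marvin version.
import os

os.environ["OPENAI_BASE_URL"] = "http://localhost:8000/v1"  # llama.cpp server
os.environ["OPENAI_API_KEY"] = "not-needed"                 # local server ignores it
os.environ["MARVIN_CHAT_COMPLETIONS_MODEL"] = "mistral-7b-instruct"

import marvin  # noqa: E402  (import after the env vars are set)
```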
Unfortunately (and surprisingly), only classification seems to work:
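A call along these lines does come back with a sensible label (sketch, assuming Marvin 2.x-style `marvin.classify`; Marvin 1.x used an `@ai_classifier` Enum decorator instead):

```python
# Assumes the environment from the configuration sketch above is in place.
import marvin

# Classification against the local Mistral model, the one Marvin feature
# that works in this setup.
label = marvin.classify(
    "The new update made everything slower and the UI is confusing.",
    labels=["positive", "negative", "neutral"],
)
print(label)  # expected: "negative"
```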
This would be awesome to get working with all of Marvin. I'm not sure where to start.
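For reference, "all of Marvin" here means things like structured extraction and casting, which don't currently return usable results against the local model (sketch, again assuming Marvin 2.x-style function names):

```python
# Assumes the environment from the configuration sketch above is in place.
import marvin

# Structured extraction: pull typed values out of free text.
prices = marvin.extract("I paid $4.50 for coffee and $9 for lunch", target=float)

# Casting: coerce free text into a single typed value.
count = marvin.cast("about three dozen", target=int)
```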