[Bug]: DeepSeek Coder V2 Not Working #541
Comments
This model doesn't support messages from the role
It seems that it's not about the system message. I only know that it's complaining about the prompt template, but I don't know why yet. Here is what I saw from Ollama:
Here is what the stream response contains:
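Since the raw log didn't come through here, one way to narrow this down is to call Ollama's /api/chat endpoint directly and check whether the role/template complaint comes back from the model itself rather than from the app. The sketch below is only an illustration, not Copilot for Xcode's actual request code; it assumes Ollama is listening on its default local port 11434 and that the model was pulled as deepseek-coder-v2.

```swift
import Foundation

// Minimal reproduction sketch: send a chat request containing a "system"
// message straight to Ollama's /api/chat endpoint and print the raw reply.
// If the model's template rejects the role, the error text should show up
// here instead of a normal assistant message.

let url = URL(string: "http://localhost:11434/api/chat")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

let payload: [String: Any] = [
    "model": "deepseek-coder-v2",   // assumed model tag
    "stream": false,
    "messages": [
        ["role": "system", "content": "You are a helpful coding assistant."],
        ["role": "user", "content": "Write a function that reverses a string."]
    ]
]
request.httpBody = try! JSONSerialization.data(withJSONObject: payload)

let semaphore = DispatchSemaphore(value: 0)
URLSession.shared.dataTask(with: request) { data, _, error in
    defer { semaphore.signal() }
    if let error = error {
        print("Request failed:", error)
    } else if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body)
    }
}.resume()
semaphore.wait()
```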
The maintainer of Ollama said that the "template not yet supported" error is irrelevant. I did some experiments but can't tell what was wrong. Let's see if Ollama 0.1.45 changes anything.
OK, I will try to adjust the scope and the message history I am sending, but I believe this model may be unusable as is.
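If trimming the conversation is the route, one possible shape for such a workaround is sketched below: fold the system prompt into the first user message and drop any roles the model rejects. The names here are hypothetical and this is not the app's real code, just an outline of the idea.

```swift
// Hypothetical workaround sketch: adapt the message history before sending
// it to a model whose chat template rejects certain roles.

struct ChatMessage {
    let role: String   // "system", "user", or "assistant"
    let content: String
}

func adaptMessages(_ messages: [ChatMessage]) -> [ChatMessage] {
    // Collect all system instructions into one block of text.
    let systemText = messages
        .filter { $0.role == "system" }
        .map(\.content)
        .joined(separator: "\n")

    // Keep only roles the model is known to accept.
    var adapted = messages.filter { $0.role == "user" || $0.role == "assistant" }

    // Prepend the system instructions to the first user message, if any.
    if !systemText.isEmpty,
       let firstUser = adapted.firstIndex(where: { $0.role == "user" }) {
        adapted[firstUser] = ChatMessage(
            role: "user",
            content: systemText + "\n\n" + adapted[firstUser].content
        )
    }
    return adapted
}

// Example: a system + user history collapses into a single user message.
let adapted = adaptMessages([
    ChatMessage(role: "system", content: "You are a concise assistant."),
    ChatMessage(role: "user", content: "Explain the error above.")
])
```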
Before Reporting
What happened?
Using Ollama to run deepseek-coder-v2:
ollama run deepseek-coder-v2
I've set up a model in CopilotForXcode, but when I try to chat with it I get "The data couldn’t be read because it is missing."
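For context, "The data couldn’t be read because it is missing." is Foundation's stock description for a DecodingError where an expected key or value is absent from the JSON. The standalone sketch below (hypothetical types, not the app's real decoder) shows how an Ollama error payload that lacks the expected "message" key would surface as exactly that message.

```swift
import Foundation

// Sketch: decoding a chunk that is missing the expected "message" key
// produces the "data couldn’t be read because it is missing" description.

struct StreamChunk: Decodable {
    struct Message: Decodable {
        let role: String
        let content: String
    }
    let message: Message
}

// Ollama normally streams chunks like {"message": {"role": "assistant", ...}}.
// If the server instead returns an error payload, "message" is absent and
// decoding fails. (The error text here is made up for illustration.)
let errorPayload = #"{"error": "this model doesn't support this request"}"#
    .data(using: .utf8)!

do {
    _ = try JSONDecoder().decode(StreamChunk.self, from: errorPayload)
} catch {
    // Typically prints: The data couldn’t be read because it is missing.
    print(error.localizedDescription)
}
```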
How to reproduce the bug.
Install deepseek-coder-v2 and configure a chat model to talk to Ollama.
Relevant log output
macOS version
15
Xcode version
15
Copilot for Xcode version
0.33.4