Please set the start and end separators (start_sep and end_sep) in the config file of that local model and try again. You can set start_sep and end_sep in the config files.
In that step, it needs to extract the code from the generated output (a string). Even when we change the start and end separators, it gives the same result, because the generated output (string) is None. For reference, I added a print statement to inspect the code, and it prints None. Why is it generating None? Does this code work with the local model ollama mistral? Could you please tell me what changes I need to make to use Open Interpreter with a local LLM?
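For context, here is a minimal sketch of what that extraction step conceptually does, assuming the separators are ordinary fenced code blocks; the variable names and default values below are hypothetical and are not Open Interpreter's actual internals:

```python
from typing import Optional

# Hypothetical separators -- in practice these would come from
# start_sep / end_sep in the local model's config file.
start_sep = "```python"
end_sep = "```"

def extract_code(completion: Optional[str]) -> Optional[str]:
    # If the local model returned nothing, there is no string to slice.
    # This is exactly the NoneType symptom described above.
    if completion is None:
        return None
    start = completion.find(start_sep)
    if start == -1:
        return None
    start += len(start_sep)
    end = completion.find(end_sep, start)
    if end == -1:
        return None
    return completion[start:end].strip()

print(extract_code(None))                                # -> None
print(extract_code("```python\nprint('hi')\n```"))       # -> print('hi')
```

If the completion is already None before this step, the separators are not the problem; the request to the local model is failing upstream (server not running, wrong host, or wrong port), so nothing ever comes back to parse.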
Please attach all the logs from the interpreter and from ollama mistral. First check the code format: is it using the "'''" format for code blocks or not? Also, is code-llama running on the same port as defined in local-model.config? The port should match and the host should be localhost; please check both.
Please share your local-model.config file here as well.
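As a quick server-side check before looking at separators, something like the following can confirm that an Ollama instance is actually reachable on the expected host and port (11434 is Ollama's default; adjust the URL to whatever local-model.config specifies):

```python
# Minimal reachability check for a local Ollama server.
# The host and port here are assumptions -- match them to local-model.config.
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def check_ollama(base_url: str = OLLAMA_URL) -> None:
    try:
        # /api/tags lists the models available on the local Ollama server.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            models = json.load(resp).get("models", [])
            print("Ollama is up. Models:", [m.get("name") for m in models])
    except (urllib.error.URLError, OSError) as exc:
        print(f"Could not reach Ollama at {base_url}: {exc}")

if __name__ == "__main__":
    check_ollama()
```

If this check fails, a None completion from the interpreter is expected, because no response is coming back from the model at all.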
Describe the bug
After giving a prompt, it does not generate any output; it raises a NoneType error instead.