
When using the local model (Ollama Mistral), it gives an error for code: it does not extract code from the generated output. #17

Open
SaiSravanthi73 opened this issue Jun 17, 2024 · 3 comments

@SaiSravanthi73

(Screenshot attached: Screenshot 2024-06-17 141410)
Describe the bug
After giving a prompt, it does not generate output; it gives a NoneType error.


Desktop (please complete the following information):

  • OS: Windows 10


@haseeb-heaven (Owner)

Please set the start and end separators in the config file for that local model and try again.
You can set start_sep and end_sep in the config files.
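
For example (illustrative only; check your actual local-model config for the exact file format, as these key names come from the comment above but the values are an assumption), the separators could look like:

````ini
# Separators the interpreter looks for when extracting code from the
# model's raw output. Values here are an assumption -- set them to
# whatever delimiters your model actually emits around code blocks.
start_sep = ```python
end_sep = ```
````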

@SaiSravanthi73 (Author)

> Please set the start and end separators in the config file for that local model and try again. You can set start_sep and end_sep in the config files.

It needs to extract code from the generated output (a string). When we change the start and end separators, it still gives the same error, because the generated output (string) is None. For reference, I added a print statement to see the code, and it prints None. Why is it generating None? Does this code work with the local model Ollama Mistral? Could you please help me with what changes I need to make to use open interpreter with a local LLM?
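
For context, here is a minimal hypothetical sketch of separator-based extraction (not the project's actual code), showing why a None response from the model can never yield code, regardless of which separators are configured:

````python
from typing import Optional

def extract_code(output: Optional[str],
                 start_sep: str = "```",
                 end_sep: str = "```") -> Optional[str]:
    """Return the code between the separators, or None if it cannot be found."""
    if output is None:                 # the model returned nothing at all
        return None
    start = output.find(start_sep)
    if start == -1:                    # no opening separator in the response
        return None
    start += len(start_sep)
    end = output.find(end_sep, start)
    if end == -1:                      # no closing separator
        return None
    return output[start:end].strip()

# If the LLM call itself returned None, extraction returns None no matter
# which separators are configured:
print(extract_code(None))                   # -> None
print(extract_code("```print('hi')```"))    # -> "print('hi')"
````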

@haseeb-heaven (Owner)

haseeb-heaven commented Jun 18, 2024

Please attach all the logs from the interpreter and Ollama Mistral. First, check the code format: is it using the "'''" format for code blocks or not? And is code-llama running on the same port as defined in local-model.config? The port should match, and the host should be localhost; please check them.

Please share your local-model.config file here as well.
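
For the port check, a quick sanity test against Ollama's standard HTTP API (assuming the default port 11434; substitute whatever port your local-model.config defines):

````python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # must match the port in local-model.config

# /api/tags is Ollama's standard endpoint listing locally installed models.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    tags = json.load(resp)

# Expect something like "mistral:latest" in this list; a connection error
# here means the interpreter is pointing at the wrong host/port.
print([m["name"] for m in tags.get("models", [])])
````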
