Prompt for LLM #94
Replies: 8 comments
-
Yes, this is something I can get to this week. Fairly straightforward task. Would you be interested in opening a PR for this, perhaps?
-
Great!! Look, I'm a regular user who saw a video on YouTube and downloaded the program; I don't know what "opening a PR" means. But I think the program is very promising and I'm testing it. I set it up to work with LM Studio, and that way it was faster than running the LLM natively. I tested some embeddings. One difficulty I have is that it is slow with large texts, since the embedding processes a lot, especially when we try to make a change. I know the embeddings are the core of the program and I love them. I can always see what is being sent to the LLM through LM Studio. I am from Brazil.
-
Ah gotcha lol. A PR is a pull request, which is how contributions are made to an open-source project :) What kind of hardware do you have on your computer? Trying to understand why it is slow...
-
Intel(R) Core(TM) i7-7700 CPU @ 3.60 GHz, 16 GB RAM, GTX 1050 GPU
-
For now, so I can continue testing, is there any way for me to access the prompt through VS Code?
-
I managed to run it in developer mode and accessed the prompt inside the electron folder. I changed it, and it's working very well for my language now.
-
The LLM is now much more in tune with my purpose and responds in Portuguese.
-
@Arcovoltaire interesting, this seems to make a big difference! I'm working on making this customisable in the app this week. Our interfaces to the LLMs are overall not the best, and there are lots of places to make improvements. I'll keep you posted :)
-
I would like to sincerely ask you to provide a way to change the prompt that is sent with the context, so that I can specify which language the LLM should respond in, what tone it should use, and similar instructions.
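Since the app is an Electron project, one way a user-facing override like this could look is a small settings object merged into the system prompt before it is sent with the context. Below is a minimal sketch in TypeScript; the names (`PromptOverrides`, `buildSystemPrompt`, the default prompt text) are hypothetical and not taken from the app's actual code.

```typescript
// Hypothetical sketch of a user-configurable prompt override.
// None of these names come from the app itself; they only illustrate
// how language/tone settings could be folded into the system prompt.

interface PromptOverrides {
  responseLanguage?: string;  // e.g. "Portuguese"
  tone?: string;              // e.g. "concise and friendly"
  extraInstructions?: string; // any other user-supplied commands
}

const DEFAULT_SYSTEM_PROMPT =
  "Answer the user's question using the provided context.";

function buildSystemPrompt(overrides: PromptOverrides = {}): string {
  const parts = [DEFAULT_SYSTEM_PROMPT];
  if (overrides.responseLanguage) {
    parts.push(`Always respond in ${overrides.responseLanguage}.`);
  }
  if (overrides.tone) {
    parts.push(`Use a ${overrides.tone} tone.`);
  }
  if (overrides.extraInstructions) {
    parts.push(overrides.extraInstructions);
  }
  return parts.join(" ");
}

// Example: the settings a Portuguese-speaking user might choose.
console.log(
  buildSystemPrompt({ responseLanguage: "Portuguese", tone: "direct, technical" })
);
```

A settings file or UI field holding a `PromptOverrides`-style object would let users change language and tone without editing files inside the electron folder by hand.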