Add support for function calling handling #76
Maybe I'm lacking imagination -- could you provide an example of the kinds of things you could do in Emacs with this? (Keeping in mind that this requires sending OpenAI the full type signatures of all the available functions.)
The main use case I have in mind is cleaner manipulation of existing buffer content. Right now it can occasionally be difficult to get ChatGPT's response to be just a code completion. Even if you prompt it with something like "You are a large language model and a careful programmer. Provide code that completes what is provided, and only code as output, without any additional text, prompt or note.", it still does not always give only the completion.

Furthermore, given the proper API, it could do pretty interesting things on larger segments of code while avoiding having to send back the entire text. It might be possible to provide it with editing commands that allow it to make edits to specific parts of a buffer or a selected region. Some thinking and testing is needed around what exactly the API should look like here.
This is something that I think will become increasingly important as the context window is made larger and it can operate over larger and larger segments of code bases. As the context window increases, we could even consider sending over entire projects and then having the edit API specify edits in specific files.
IIUC, you're describing the case of supplying ChatGPT the descriptions and type signatures of elisp functions, and then asking it to combine those (along with its general knowledge of elisp) to make edits to the buffer. I'm still unable to think of an actual use case in this context, since it already understands the elisp API primitives quite well. The fact that buffer-editing elisp functions have to work by side effect doesn't help either. Can you think of a more specific example, like the weather-report function in the documentation?
Nope. Did you read the text I wrote above? I don't think there would be any expectation that it directly call arbitrary elisp functions. You would probably give it a relatively limited API that would simply allow it to specify edits to existing code. As an example, you might give ChatGPT functions like:

delete(start_line: int, start_character: int, end_line: int, end_character: int)

You would then handle these requests from ChatGPT internally with elisp code. The advantage would be that:

a) You could differentiate between cases where ChatGPT wants to simply respond to the user, perhaps to ask for more information or give some exposition, and cases where it wants to make edits to the code that was provided.

In the current status quo, you can SOMETIMES get ChatGPT to complete things if you prompt it just right, but often it will give extra exposition that you have to delete or otherwise modify. Also, you're never going to get it to make inline edits to a function working through gptel. With my suggestion, it seems quite plausible to me that you could have it perform complicated transformations of an input text.
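To make the idea concrete, here is a minimal sketch of what such an edit-primitive declaration could look like in the shape the OpenAI "functions" parameter expects. The name `delete_text` and the surrounding payload are assumptions for illustration, not anything gptel implements:

```python
import json

# Hypothetical edit primitive, declared in the JSON-schema shape the
# OpenAI "functions" parameter expects (name and fields are assumptions).
delete_fn = {
    "name": "delete_text",
    "description": "Delete the buffer text between two positions.",
    "parameters": {
        "type": "object",
        "properties": {
            "start_line": {"type": "integer"},
            "start_character": {"type": "integer"},
            "end_line": {"type": "integer"},
            "end_character": {"type": "integer"},
        },
        "required": ["start_line", "start_character",
                     "end_line", "end_character"],
    },
}

# The declaration rides along with the chat messages in the request body:
payload = {
    "model": "gpt-3.5-turbo-0613",
    "messages": [{"role": "user", "content": "Remove the dead code."}],
    "functions": [delete_fn],
}
print(json.dumps(payload["functions"][0]["name"]))
```

The point is that the client, not the model, executes the edit: gptel would translate a `delete_text` call back into ordinary buffer operations in elisp.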
You can't think of an actual use case? Here's an example of a problem that GPT-4 can easily one-shot: https://chat.openai.com/share/e3a05c2e-7dd4-4e97-bc3d-fd6decd49285

In an ideal world, gptel could handle making the necessary edits, and I think using the functions API is a good way to do this. If nothing else, a v1 of this could literally only allow complete replacements of the existing text. It would still be useful for it to be able to specify "here is the start and end of the existing text that this new text should replace", and the functions API seems like a natural way to exchange that information.
Hmmm, that API is basically exposing another argument (like "model name" or "temperature") in which one can optionally convey an array of function declarations (with optional docstrings).

When I saw this post, the thought I had was: maybe one wants to refactor a very large function or piece of code. Currently that requires getting the full thing rewritten and then doing an ediff (possible from the refactor transient), whereas it might be the case that the actual diff needed is tiny, and perhaps this way GPT can convey it better via some funcalls, and perhaps one could execute those straight from the reply to effect the change. But with @IvanMalison's latest reply, it seems one doesn't have to actually give it elisp functions, but instead a small set of simple "made up" text-manipulation primitives which one can handle in the callback with actual elisp? (Safer and potentially more promising.)

(Although TBF, in the above case I would, in a gptel buffer, just ask it to reply with only a diff of what to change, which I'd then try to apply by using the diff-mode bindings, which might be very easy.)

Since gptel supports custom callbacks via the lower-level function gptel-request, that API can already be experimented with. The only (minor) missing piece would be a way to set that parameter in the curl call... to tinker with the idea, one can just defadvice-override the curl args function to additionally hardcode in a "functions" argument? Then it'd be interesting to see some real use cases demoed.
@PalaceChan, do you have API access? I haven't gotten it yet and I've been on the wait-list for months. I would totally start hacking on this if I did.
@IvanMalison I am also on the wait-list, but that only applies to GPT-4, not to gpt-3.5-turbo-0613. Run this command to see which models you have:

In my case I don't see any of the GPT-4 models. Then try this command, for example:

(The model replied saying it cannot do that because it doesn't have weather-check access.) Now try running this:

This time the response I got used the function API, trimming the fat out of my request.

(A very, very minor side-note unrelated to the functions API, but @karthink, I remember reading that.) Here is a simple script I was using to explore the suggestion, following their example guide:

The first time through I did not pass any functions. Then I passed it the functions array.

Probably have to play around with this some more...
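For reference, a hedged Python sketch of the shape such an exploration script takes. The response structure follows OpenAI's function-calling guide for the 0613 models; the `delete_text` call in the sample is invented for illustration, and no network request is made here:

```python
import json

def handle_response(response):
    """Dispatch on an OpenAI chat completion: either a plain text reply,
    or a function call whose arguments arrive as a JSON string."""
    message = response["choices"][0]["message"]
    if message.get("function_call"):
        call = message["function_call"]
        return ("call", call["name"], json.loads(call["arguments"]))
    return ("text", message["content"])

# Shape of a function-calling reply per the 0613 API (values invented):
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": "delete_text",
                "arguments": '{"start_line": 3, "start_character": 0, '
                             '"end_line": 5, "end_character": 10}',
            },
        }
    }]
}

kind, name, args = handle_response(sample)
print(kind, name, args["start_line"])  # → call delete_text 3
```

Note that `arguments` is a string of JSON, not a JSON object, so the client has to parse (and ideally validate) it before acting on the edit.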
Ah, very cool, nice work. Perhaps we should start with an even simpler API that doesn't have two functions. That said, the fact that this is a 3.5 variant doesn't give me tons of hope.
@IvanMalison even though I'm personally still on the wait-list, I was able to get help running this with gpt-4-32k-0613. For the original example it still replied with the "delete_text" call only. Then I changed the example to a buggy Python function:
Running that example with the two-function API and gpt-4-32k-0613, and asking it to fix the bug, it again called only "delete_text". I then changed the functions array to have a single function. This time it replied by calling it; the region seems wrong, as it clips the for-loop part in the middle, but it is slightly cooler to see it trying.

Nevertheless, this is still much less "useful" than when I simply asked the same thing in the chat buffer and asked it to format its reply in the form of a diff hunk; there one gets a lovely org src block.
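The single-function variant isn't fully specified above, so here is a hedged sketch assuming a hypothetical `replace_text(start_line, end_line, new_text)` primitive applied to a buffer held as a string (1-based, inclusive line indexing; purely illustrative):

```python
def apply_replace_text(text, start_line, end_line, new_text):
    """Replace lines start_line..end_line (1-based, inclusive) of `text`
    with `new_text`, mirroring a hypothetical replace_text funcall."""
    lines = text.splitlines()
    lines[start_line - 1:end_line] = new_text.splitlines()
    return "\n".join(lines)

# A buggy function like the one described: subtraction where it should add.
buggy = ("def f(xs):\n"
         "    total = 0\n"
         "    for x in xs:\n"
         "        total -= x\n"
         "    return total")

# Apply a model-proposed single-line replacement of line 4.
fixed = apply_replace_text(buggy, 4, 4, "        total += x")
print(fixed.splitlines()[3])  # → total += x (with leading indentation)
```

Getting the model to emit a correct region is exactly the hard part reported above: if it clips a line in the middle of the for loop, the client-side applier faithfully produces a broken buffer.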
I put together a proof of concept for this: #209
See https://platform.openai.com/docs/guides/gpt/function-calling
It would be cool to somehow add support for this. Some inversion-of-control-style setup, where you could configure the system prompt to let the model call Emacs functions or something like that, might be interesting.
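One way to picture the inversion of control: a dispatch table mapping declared function names to local handlers, so the model chooses the call but the editor executes it. Python stands in for elisp here, and `get_weather` is the stock example from OpenAI's guide, stubbed out:

```python
import json

# Hypothetical local handlers; in Emacs these would be elisp functions.
def get_weather(location):
    return f"(stub) weather for {location}"

HANDLERS = {"get_weather": get_weather}

def dispatch(function_call):
    """Route a model-issued function_call to the matching local handler,
    decoding its JSON-string arguments into keyword arguments."""
    handler = HANDLERS[function_call["name"]]
    return handler(**json.loads(function_call["arguments"]))

result = dispatch({"name": "get_weather",
                   "arguments": '{"location": "Boston"}'})
print(result)  # → (stub) weather for Boston
```

In a real integration, the handler's return value would be sent back to the model as a `function` role message so it can compose a final answer.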