Is there support for function call tokens? #39
Comments
I'm sorry, OpenAI does not provide any explanation of how function call costs are calculated, so we currently do not know how to calculate them either.
The Java implementation of Tiktoken seems to have a util for calculating tokens for functions. I have no idea where they got this solution from, however.
I tried a few simple examples following the calculation method in that code snippet, but the computed values were inconsistent with the usage returned by the API. The following examples serve as evidence:

In the first example, the API response reports a prompt_tokens usage of 43. In the second example, it reports a prompt_tokens usage of 35.

The only difference between the two examples is that the type of test_string changed from string to object, yet the consumption decreased by 8. The literals "string" and "object" should each be a single token and should not cause a discrepancy. Token consumption apparently has to be computed together with the validity of the schema. Without OpenAI disclosing the rules, it is very difficult to infer them through experimentation.
However, at least some basic rules are certain, so we can provide an approximate method for estimating usage when invoking the API.
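For what it's worth, community reverse engineering (which is what ports like the Java Tiktoken util appear to be based on) suggests that the API injects function definitions into the prompt as a TypeScript-like `namespace` declaration and tokenizes that text, plus a small fixed overhead. This is a rough sketch of that idea; the exact serialization format, the helper name `format_function_definitions`, and the overhead constant are all assumptions, not anything documented by OpenAI:

```python
def format_function_definitions(functions):
    """Render function schemas in a TypeScript-like form for token estimation.

    Assumption: definitions are serialized roughly like
        namespace functions {
        // description
        type fn_name = (_: { param: string, }) => any;
        } // namespace functions
    and then tokenized with the model's encoding (e.g. cl100k_base).
    """
    lines = ["namespace functions {", ""]
    for fn in functions:
        if fn.get("description"):
            lines.append(f"// {fn['description']}")
        params = fn.get("parameters", {}).get("properties", {})
        required = set(fn.get("parameters", {}).get("required", []))
        if params:
            lines.append(f"type {fn['name']} = (_: {{")
            for name, schema in params.items():
                if schema.get("description"):
                    lines.append(f"// {schema['description']}")
                optional = "" if name in required else "?"
                lines.append(f"{name}{optional}: {schema.get('type', 'any')},")
            lines.append("}) => any;")
        else:
            lines.append(f"type {fn['name']} = () => any;")
        lines.append("")
    lines.append("} // namespace functions")
    return "\n".join(lines)

# An estimate would then be something like (FUNCTION_OVERHEAD is a guess):
#   import tiktoken
#   enc = tiktoken.get_encoding("cl100k_base")
#   FUNCTION_OVERHEAD = 9
#   tokens = len(enc.encode(format_function_definitions(functions))) + FUNCTION_OVERHEAD
```

As the earlier comments note, the real count also depends on schema details in ways this serialization does not capture, so treat any such estimate as approximate.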
Thanks for the project. Great so far.
Is there support for counting tokens when using function calls?
https://platform.openai.com/docs/guides/gpt/function-calling