Add OCI Generative AI tool calling support #16888
Conversation
```diff
@@ -219,6 +190,7 @@ def _get_all_kwargs(self, **kwargs: Any) -> Dict[str, Any]:
             **kwargs,
         }

+    @deprecated("Deprecated in favor of `chat`, which should be used instead.")
```
Technically, you can make `complete` call `chat` -- we have utils to make this very easy. A few LLMs do this.
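(For reference, a minimal sketch of that pattern, assuming the `chat_to_completion_decorator` helpers in `llama_index.core.base.llms.generic_utils`; the class and method bodies here are illustrative, not code from this PR.)

```python
# Hedged sketch: routing `complete` through `chat` with the generic_utils
# decorators. `chat` and `stream_chat` are assumed to be implemented
# elsewhere on the class; this is not the PR's actual code.
from typing import Any

from llama_index.core.base.llms.generic_utils import (
    chat_to_completion_decorator,
    stream_chat_to_completion_decorator,
)
from llama_index.core.base.llms.types import (
    CompletionResponse,
    CompletionResponseGen,
)


class ChatOnlyLLMSketch:
    def complete(
        self, prompt: str, formatted: bool = False, **kwargs: Any
    ) -> CompletionResponse:
        # Wrap the chat endpoint so completion-style calls keep working.
        complete_fn = chat_to_completion_decorator(self.chat)
        return complete_fn(prompt, **kwargs)

    def stream_complete(
        self, prompt: str, formatted: bool = False, **kwargs: Any
    ) -> CompletionResponseGen:
        stream_fn = stream_chat_to_completion_decorator(self.stream_chat)
        return stream_fn(prompt, **kwargs)
```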
Thank you @logan-markewich. Agreed, but basically all the models that support completion have been deprecated in our service. Let me take a look at other vendors.
llama-index-integrations/llms/llama-index-llms-oci-genai/llama_index/llms/oci_genai/base.py (review thread resolved)
```diff
             **chat_params,
         }

     def chat_with_tools(
```
Technically you shouldn't need to implement `chat_with_tools` or `stream_chat_with_tools` -- the idea is that `chat` and `stream_chat` already handle tools being passed in/out, and all you need to implement is `_prepare_chat_with_tools()`, `get_tool_calls_from_response()`, and maybe `_validate_chat_with_tools_response()`. Check out Anthropic or Ollama for a decent example of this.
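(For context, a rough sketch of those two hooks on a `FunctionCallingLLM` subclass; the payload keys and conversion details below are illustrative assumptions, not this PR's actual code, and the rest of the LLM interface is omitted.)

```python
# Hedged sketch of the two required FunctionCallingLLM hooks. Only these
# hooks are shown; the provider-specific kwargs ("messages", "tools") and
# the tool-call dict keys are assumptions for illustration.
from typing import Any, Dict, List, Optional, Sequence, Union

from llama_index.core.base.llms.types import ChatMessage, ChatResponse
from llama_index.core.llms.function_calling import FunctionCallingLLM
from llama_index.core.llms.llm import ToolSelection
from llama_index.core.tools.types import BaseTool


class SketchLLM(FunctionCallingLLM):
    def _prepare_chat_with_tools(
        self,
        tools: Sequence[BaseTool],
        user_msg: Optional[Union[str, ChatMessage]] = None,
        chat_history: Optional[List[ChatMessage]] = None,
        verbose: bool = False,
        allow_parallel_tool_calls: bool = False,
        **kwargs: Any,
    ) -> Dict[str, Any]:
        # Convert tool metadata into the provider's schema and bundle it
        # with the messages; chat()/stream_chat() receive these kwargs.
        tool_specs = [t.metadata.to_openai_tool() for t in tools]
        if isinstance(user_msg, str):
            user_msg = ChatMessage(role="user", content=user_msg)
        messages = list(chat_history or []) + ([user_msg] if user_msg else [])
        return {"messages": messages, "tools": tool_specs, **kwargs}

    def get_tool_calls_from_response(
        self,
        response: ChatResponse,
        error_on_no_tool_call: bool = True,
        **kwargs: Any,
    ) -> List[ToolSelection]:
        # Pull provider tool calls out of additional_kwargs and normalize
        # them into ToolSelection objects for the agent layer.
        tool_calls = response.message.additional_kwargs.get("tool_calls", [])
        if not tool_calls and error_on_no_tool_call:
            raise ValueError("Expected at least one tool call in the response.")
        return [
            ToolSelection(
                tool_id=tc["id"],
                tool_name=tc["name"],
                tool_kwargs=tc["arguments"],
            )
            for tc in tool_calls
        ]
```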
Hi @logan-markewich, I have updated `chat` and `stream_chat` and also removed `chat_with_tools`/`stream_chat_with_tools` in our implementation. We tested both `chat` and `stream_chat` with and without tools, and everything worked fine. Thanks for the suggestions!
@logan-markewich Hi, we have addressed all your comments. Please take another look when you have a moment and let us know if there are any remaining issues. We're keen to get this merged, as our customers are eagerly awaiting this feature.
@logan-markewich Hi, could you kindly take a look at the PR when you've got time?
Description
Adding tool calling support for chat models offered by OCI Generative AI services (see the usage sketch after this section). This is an update to our last PR #14080.
Fixes # (issue)
N/A
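As a usage sketch of the new capability (the model name, service endpoint, and compartment OCID below are placeholder assumptions, not values from this PR):

```python
# Hedged usage sketch; endpoint, compartment OCID, and model name are
# placeholders -- substitute any tool-calling-capable OCI chat model.
from llama_index.core.tools import FunctionTool
from llama_index.llms.oci_genai import OCIGenAI


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = OCIGenAI(
    model="cohere.command-r-plus",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
)

# chat_with_tools comes from the FunctionCallingLLM base class once the
# _prepare_chat_with_tools/get_tool_calls_from_response hooks exist.
response = llm.chat_with_tools(tools=[multiply_tool], user_msg="What is 3 * 4?")
tool_calls = llm.get_tool_calls_from_response(response, error_on_no_tool_call=False)
```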
New Package?
No
Did I fill in the `tool.llamahub` section in the `pyproject.toml` and provide a detailed README.md for my new integration or package?

Version Bump?

Did I bump the version in the `pyproject.toml` file of the package I am updating? (Except for the `llama-index-core` package)

Type of Change
How Has This Been Tested?
Suggested Checklist:

I ran `make format; make lint` to appease the lint gods