
Update boto3 dependency to version 1.28.57, refactor bedrock client initialization and remove troubleshooting guide from documentation. #497

Merged 2 commits into BerriAI:main on Sep 30, 2023

Conversation

coconut49 (Contributor)

Boto3 version 1.28.57, updated yesterday, now officially supports Bedrock, so there is no need to install a special version of Boto3.
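
For anyone verifying locally, here is a quick sketch (illustrative, not part of this PR) of checking that a stock boto3 install already ships the Bedrock service definitions:

import boto3

# boto3 >= 1.28.57 bundles the Bedrock service models, so no special build is needed.
print(boto3.__version__)
available = boto3.session.Session().get_available_services()
print("bedrock" in available, "bedrock-runtime" in available)  # expected: True True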


Comment on lines +24 to +28
client = boto3.client(
service_name="bedrock-runtime",
region_name=region_name,
endpoint_url=f'https://bedrock-runtime.{region_name}.amazonaws.com'
)
coconut49 (Contributor Author)

The invoke_model method only exists on the bedrock-runtime client; see boto/boto3#3881.
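
For context, a minimal sketch of calling invoke_model through the bedrock-runtime client with stock boto3 (the region and the Claude payload fields are illustrative; other providers use different request bodies):

import json
import boto3

region_name = "us-west-2"  # illustrative region

# The plain "bedrock" client only exposes control-plane calls (listing models, etc.);
# invoke_model lives on the "bedrock-runtime" client.
client = boto3.client(
    service_name="bedrock-runtime",
    region_name=region_name,
    endpoint_url=f"https://bedrock-runtime.{region_name}.amazonaws.com",
)

body = json.dumps({
    "prompt": "\n\nHuman: Hello, Bedrock!\n\nAssistant:",
    "max_tokens_to_sample": 256,
})

response = client.invoke_model(
    modelId="anthropic.claude-instant-v1",
    body=body,
    contentType="application/json",
    accept="application/json",
)
print(json.loads(response["body"].read()))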

Contributor

If possible, can you run the Bedrock tests in test_completion.py with these changes and send a screenshot showing that they pass?

If not, I can test them tomorrow.
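
For readers following along: the Bedrock tests in test_completion.py exercise litellm's completion() call against Bedrock-hosted models. A rough sketch of such a test (the model name and assertion are illustrative, not the exact repo code):

import litellm

def test_completion_bedrock_claude():
    # Requires AWS credentials with Bedrock access in the environment.
    response = litellm.completion(
        model="bedrock/anthropic.claude-instant-v1",
        messages=[{"role": "user", "content": "Hey, how's it going?"}],
    )
    # Basic sanity check on the response shape.
    assert response["choices"][0]["message"]["content"]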

coconut49 (Contributor Author)

The Titan test fails because I don't have access, but the rest pass. Please refer to the screenshots. Thank you for your prompt assistance!
[image: pytest results screenshot]

Contributor

Looks awesome - what's the tool you're using to run pytest? It looks super cool.

coconut49 (Contributor Author)

😊 It's PyCharm with the following config:
[image: PyCharm run configuration]

@ishaan-jaff (Contributor)

@coconut49 thanks for this awesome PR!! Really appreciate it!

@ishaan-jaff (Contributor) left a comment

Looks good to me - can we just not add boto3 as a dependency for litellm?

pyproject.toml Outdated
@@ -14,6 +14,7 @@ tiktoken = ">=0.4.0"
importlib-metadata = ">=6.8.0"
tokenizers = "*"
click = "*"
boto3 = ">=1.28.57"
Contributor

Can we not add boto3 as a dependency of litellm?

coconut49 (Contributor Author)

Yes, I struggled with that too; I've moved it out of the required dependencies. Thanks.
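
For context, a common way to keep boto3 out of the required dependencies is to import it lazily inside the Bedrock code path and fail with a clear message when it is missing. A minimal sketch of that pattern (illustrative, not litellm's actual code):

def get_bedrock_client(region_name: str):
    try:
        import boto3  # deferred import keeps boto3 an optional dependency
    except ImportError as e:
        raise ImportError(
            "boto3>=1.28.57 is required for Bedrock support; "
            "install it with: pip install 'boto3>=1.28.57'"
        ) from e

    return boto3.client(
        service_name="bedrock-runtime",
        region_name=region_name,
        endpoint_url=f"https://bedrock-runtime.{region_name}.amazonaws.com",
    )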


@ishaan-jaff (Contributor) left a comment

lgtm

@ishaan-jaff (Contributor)

@coconut49 what's your Twitter / LinkedIn? We'd love to give you a shoutout.

@ishaan-jaff ishaan-jaff merged commit 202d57f into BerriAI:main Sep 30, 2023
@krrishdholakia (Contributor)

@coconut49 bump on this? Your work is awesome; we'd love to celebrate it!

@coconut49 (Contributor Author)

I am pleased to offer my code contributions. In fact, I have long hoped for a project that would provide a standardized interface over the various LLM API implementations. With a project like LiteLLM in the works, developers need not worry about the disparities between LLM APIs. Thank you for initiating this project and for all your efforts on it.

I'm @OrangeCat59 on Twitter and Jie on LinkedIn!
