[BUG] #1617

Open
luisandino opened this issue Nov 17, 2024 · 0 comments
Labels
bug Something isn't working

Description

I am trying to connect Bedrock to crewAI. Even though the model ID is correct and the agents are configured correctly, whenever an agent tries to call Bedrock it gets a 400 Bad Request error caused by something LiteLLM does internally.

Steps to Reproduce

  1. Set a custom LLM class for the agents
  2. Provide the model_id
  3. Set up the agents, manager agent, tasks, and crew
  4. Call the kickoff method and wait for the output (a minimal sketch of these steps is shown below)
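
A minimal sketch of the setup above, with placeholder agent, task, and question definitions (not the exact code from my project); the manager agent is omitted and a sequential process is used for brevity:

from crewai import LLM, Agent, Crew, Process, Task

# Steps 1-2: custom LLM pointing at Bedrock through LiteLLM
bedrock_llm = LLM(
    model="bedrock/meta.llama3-1-70b-instruct-v1:0",
    temperature=0.2,
)

# Step 3: agents, tasks, and crew (placeholder definitions)
researcher = Agent(
    role="Researcher",
    goal="Answer the user's question",
    backstory="Domain expert",
    llm=bedrock_llm,
)
answer_task = Task(
    description="Answer the question: {question}",
    expected_output="A concise answer",
    agent=researcher,
)
crew = Crew(
    agents=[researcher],
    tasks=[answer_task],
    process=Process.sequential,
    max_rpm=10,
)

# Step 4: kick off and wait for the output
result = crew.kickoff(inputs={"question": "What is Amazon Bedrock?"})
print(result)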

Expected behavior

User receives an output according to the provided question

Actual behavior

The agent receives a 400 error message from LiteLLM, the crew reaches the max_rpm threshold, waits a minute, and keeps retrying in the same cycle unless I interrupt it.

Screenshots/Code snippets

[screenshot]

Operating System

Other (specify in additional context)

Python Version

3.10

crewAI Version

0.80.0

crewAI Tools Version

0.14.0

Virtual Environment

Venv

Evidence

I am using the following snippet to instantiate the crewAI LLM class:

LLM(
    model=chat_provider + "/" + chat_model,
    temperature=chat_temperature,
)

where chat_provider = "bedrock" and chat_model = "meta.llama3-1-70b-instruct-v1:0". In the YAML configuration files, the equivalent string would be something like:

llm: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
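
The AWS credentials for Bedrock are provided through environment variables that LiteLLM reads for the bedrock/ provider (placeholder values shown; the exact variable names, in particular AWS_REGION_NAME, are my assumption based on the LiteLLM docs):

import os

# Placeholder credentials; LiteLLM picks these up for bedrock/ models
os.environ["AWS_ACCESS_KEY_ID"] = "<access-key-id>"
os.environ["AWS_SECRET_ACCESS_KEY"] = "<secret-access-key>"
os.environ["AWS_REGION_NAME"] = "us-east-1"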

Possible Solution

Adapt the internal code that calls the boto3 client so that it correctly passes the user message received as the question from the crew.kickoff method.
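
To help isolate the problem, the same model can be called through LiteLLM directly, outside of crewAI; if this minimal call also returns a 400, the issue is in how LiteLLM builds the Bedrock request rather than in crewAI itself (a diagnostic sketch, assuming AWS credentials are already configured in the environment):

import litellm

# Direct LiteLLM call to the same Bedrock model, bypassing crewAI
response = litellm.completion(
    model="bedrock/meta.llama3-1-70b-instruct-v1:0",
    messages=[{"role": "user", "content": "Hello, can you hear me?"}],
    temperature=0.2,
)
print(response.choices[0].message.content)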

Additional context

Linux pop-os 6.9.3-76060903-generic #202405300957172676603522.04~4092a0e SMP PREEMPT_DYNAMIC Thu S x86_64 x86_64 x86_64 GNU/Linux
