
ERROR: LiteLLM call failed: litellm.APIError: APIError: GroqException - Connection error. #1624

Open
SujayC66 opened this issue Nov 19, 2024 · 0 comments
Labels
bug Something isn't working


Description

I created the project using the command:

crewai create crew <project folder name>

After editing the scripts to my requirements, running the crew with crewai run fails with the error above.

I have used ChatGroq from langchain_groq as well as the crewai LLM class (see the commented-out attempts in the snippet below).
Any lead will be appreciated.
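
Since the error message comes from LiteLLM (the layer crewAI routes LLM calls through), one way to check whether the failure is in the crew setup or in the Groq connection itself is to call LiteLLM directly. A minimal sketch, assuming GROQ_API_KEY is set in the environment and using the same model name as the snippet below:

import os
import litellm

# Minimal LiteLLM call to the same Groq model, bypassing crewAI entirely.
# Assumes GROQ_API_KEY is exported in the current shell.
response = litellm.completion(
    model="groq/llama-3.1-70b-versatile",
    messages=[{"role": "user", "content": "ping"}],
    api_key=os.environ.get("GROQ_API_KEY"),
)
print(response.choices[0].message.content)

If this standalone call raises the same "GroqException - Connection error", the problem is credentials or network (proxy, firewall) rather than the crew configuration.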

Steps to Reproduce

None provided.

Expected behavior

The crew runs and produces a markdown report.

Screenshots/Code snippets

import os

from crewai import LLM, Agent
from crewai.project import CrewBase, agent, llm
# Imports used only by the commented-out attempts:
# import openai
# from langchain_groq import ChatGroq


@CrewBase
class QuestionCrew():
    """KPIs questions discovery crew"""

    agents_config = 'config/questions/agents.yaml'
    tasks_config = 'config/questions/tasks.yaml'

    # Attempt 1: langchain_groq's ChatGroq
    # @llm
    # def llm_model(self):
    #     return ChatGroq(model="groq/llama-3.1-70b-versatile",
    #                     api_key=os.environ.get("GROQ_API_KEY"),
    #                     max_tokens=8000,
    #                     temperature=0,
    #                     )

    # Attempt 2: the OpenAI client pointed at Groq's OpenAI-compatible endpoint
    # @llm
    # def llm_model(self):
    #     return openai.OpenAI(
    #         base_url="https://api.groq.com/openai/v1",
    #         api_key=os.environ.get("GROQ_API_KEY"),
    #     )

    # Attempt 3 (active): crewAI's own LLM class
    @llm
    def llm_model(self):
        return LLM(model="groq/llama-3.1-70b-versatile",
                   api_key="",  # empty string as posted
                   max_tokens=8000,
                   temperature=0
                   )

    @agent
    def business_analyst(self) -> Agent:
        return Agent(
            config=self.agents_config['business_analyst'],
            max_rpm=None,
            verbose=True,
            # llm=LLM(model="groq/llama-3.1-70b-versatile",
            #         api_key="",
            #         max_tokens=8000,
            #         temperature=0,
            #         )
        )
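
One thing that stands out in the active branch: api_key is an empty string. If that is literal in the running code (rather than a redaction for posting), LiteLLM has no Groq credentials, which could plausibly surface as a connection failure. A sketch of the same LLM construction with the key read from the environment instead, mirroring the commented-out ChatGroq attempt:

import os

from crewai import LLM

# Sketch: same model and settings as above, but the key comes from the
# GROQ_API_KEY environment variable rather than an empty string.
groq_llm = LLM(
    model="groq/llama-3.1-70b-versatile",
    api_key=os.environ.get("GROQ_API_KEY"),
    max_tokens=8000,
    temperature=0,
)

On Windows 11 the variable has to be set in the same shell that runs crewai run, e.g. set GROQ_API_KEY=... in cmd or $env:GROQ_API_KEY="..." in PowerShell.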

Operating System

Windows 11

Python Version

3.11

crewAI Version

0.80.0

crewAI Tools Version

0.80.0

Virtual Environment

Venv

Evidence

ERROR: LiteLLM call failed: litellm.APIError: APIError: GroqException - Connection error.

Possible Solution

NA

Additional context

I tried multiple approaches (see the commented-out alternatives in the code snippet above).
