Trying to get ChatHuggingFace working with with_structured_output. Keep getting "Tool with name any not found". No tools are defined, so why?
#25675
Replies: 3 comments 1 reply
-
Hey there, @michael-newsrx! I'm here to help you with any bugs, questions, or contributions you might have. Feel free to ask for assistance while waiting for a human maintainer.

I found a similar issue that might help you resolve the "Tool with name any not found" error when using `ChatHuggingFace` with `with_structured_output`. To resolve this error, ensure that you are using a chat model that supports tool calling; the error suggests that the model you are using does not support the required tool-calling functionality. Here is an example of how to use a supported model, such as `microsoft/Phi-3-mini-4k-instruct`:

```python
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

class ArticleAnalysis(BaseModel):
    entities: list[str] = Field(description="Most important entities and concepts in the article")
    topic: str = Field(description="General topic of the article")
    source_type: str = Field(description="Type of source article")
    audience: str = Field(description="The target audience for the source article")
    taxonomy: str = Field(description="The taxonomy assignment based on the subject material")
    subheading: str = Field(description="A one or two word topic subheading that matches the taxonomy assignment")
    outline: list[str] = Field(description="A succinct cliff notes outline summary in a list format")
    quality: str = Field(description="Quality of the source article", examples=["High", "Good", "Poor", "Bad"], pattern="High|Good|Poor|Bad")

# Ensure you are using a chat model that supports tools.
llm = HuggingFaceEndpoint(
    repo_id="microsoft/Phi-3-mini-4k-instruct",
    task="text-generation",
    max_new_tokens=512,
    do_sample=False,
    repetition_penalty=1.03,
)
model = ChatHuggingFace(llm=llm, verbose=True)

structured_llm = model.with_structured_output(ArticleAnalysis)
output_msg = structured_llm.invoke("Analyze the following article: ...")
print(output_msg)
```

Make sure to reference the documentation for an up-to-date list of which models support tool calling. Additionally, if you are implementing a custom chat model, ensure that it inherits from `BaseChatModel`. For example, a `bind_tools` implementation that supports tool calling looks like this:

```python
def bind_tools(
    self,
    tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
    *,
    tool_choice: Optional[
        Union[dict, str, Literal["auto", "none", "required", "any"], bool]
    ] = None,
    **kwargs: Any,
) -> Runnable[LanguageModelInput, BaseMessage]:
    formatted_tools = [convert_to_openai_tool(tool)["function"] for tool in tools]
    formatted_tool_choice = "required" if tool_choice == "any" else tool_choice
    return super().bind(
        available_tools=formatted_tools, tool_choice=formatted_tool_choice, **kwargs
    )
```

Ensure your custom chat model has a similar implementation to support tool calling [2][3].
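The `tool_choice` remapping in the snippet above is the detail most relevant to this error: in LangChain, `"any"` is shorthand for "the model must call some tool", but a backend that expects a concrete value must have it translated (here to `"required"`) before the request is sent, or it may look for a tool literally named `any`. A minimal sketch of that mapping, with the helper name `normalize_tool_choice` being hypothetical:

```python
def normalize_tool_choice(tool_choice):
    """Translate LangChain's "any" into the "required" spelling.

    A backend that treats tool_choice as a tool *name* would
    otherwise search for a tool called "any" and fail with
    "Tool with name any not found".
    """
    if tool_choice == "any":
        return "required"
    return tool_choice

print(normalize_tool_choice("any"))   # required
print(normalize_tool_choice("auto"))  # auto
```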
-
+1 facing same issue
-
+1 facing same issue as well
-
Checked other resources
Commit to Help
Example Code
ERROR RECEIVED
Tool error: Tool with name any not found
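As a hedged illustration of where a message like this can come from (the function below is hypothetical, not the actual server code): a backend that interprets `tool_choice` as a tool name will look up `any` in its registry of bound tools and fail when no tool by that name exists.

```python
def find_tool(tools, tool_choice):
    # Index the bound tools by name; a backend that interprets
    # tool_choice as a tool name looks it up here.
    by_name = {t["name"]: t for t in tools}
    if tool_choice not in by_name:
        raise ValueError(f"Tool with name {tool_choice} not found")
    return by_name[tool_choice]

tools = [{"name": "ArticleAnalysis"}]
try:
    find_tool(tools, "any")
except ValueError as e:
    print(e)  # Tool with name any not found
```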
JSON Body Sent