OpenAI client libraries with GitHub Models #146699
-
Issue: API version required when using structured outputs with GH models

I'm using the Python OpenAI library with GH models and found that it requires an API version when making calls that use structured outputs. Example code and error below:

import os
from typing import List

import openai
from pydantic import BaseModel

class Step(BaseModel):
    explanation: str
    output: str

class MathResponse(BaseModel):
    steps: List[Step]
    final_answer: str

client = openai.OpenAI(
    base_url="https://models.inference.ai.azure.com/",
    api_key=os.environ["GH_TOKEN"],
)

completion = client.beta.chat.completions.parse(
    messages=[
        {"role": "user", "content": "solve 8x + 31 = 2"},
    ],
    response_format=MathResponse,
    model="gpt-4o",
)

Error:
I can work around this by passing the API version as a default query parameter on the client, but I shouldn't have to do this:

default_query = {"api-version": "2024-08-01-preview"}

client = openai.OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],
    default_query=default_query,
)
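Putting the two snippets together, a minimal sketch of the workaround applied to the structured-output call might look like this. It reuses the Step and MathResponse models and imports from the snippet above; the api-version value, endpoint URL, and model name are copied from this thread rather than official documentation:

# Pin the API version via a default query parameter so the structured-output
# (parse) call is accepted by the GitHub Models endpoint.
client = openai.OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],
    default_query={"api-version": "2024-08-01-preview"},
)

completion = client.beta.chat.completions.parse(
    messages=[{"role": "user", "content": "solve 8x + 31 = 2"}],
    response_format=MathResponse,
    model="gpt-4o",
)

# The parsed Pydantic object is exposed on the message.
math = completion.choices[0].message.parsed
print(math.final_answer)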
-
How can I do this? I can't get my head around my issue right now.
-
Select Topic Area
Question
Body
Creating this discussion to track issues around using the official OpenAI client libraries for Python, .NET/C#, JavaScript/TypeScript, etc. with GitHub AI models.
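For anyone finding this thread, the basic pattern under discussion is pointing the official OpenAI Python client at the GitHub Models endpoint and authenticating with a GitHub token. A minimal sketch follows; the endpoint URL and GITHUB_TOKEN environment variable are taken from the replies above, and the model name is only an example:

import os

import openai

# Use the official OpenAI client against the GitHub Models endpoint,
# authenticating with a GitHub token instead of an OpenAI API key.
client = openai.OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],
)

response = client.chat.completions.create(
    model="gpt-4o",  # example model name; substitute any model available in GitHub Models
    messages=[{"role": "user", "content": "Say hello from GitHub Models."}],
)
print(response.choices[0].message.content)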