
Ignore any warnings LiteLLM might emit on import #38

Merged (1 commit) on Apr 8, 2024

Conversation

nicovank (Collaborator) commented on Apr 8, 2024

If you run ChatDBG with a recent version of LiteLLM, you'll get a few warnings:

/home/vscode/.local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_list" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/vscode/.local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_name" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/vscode/.local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_group_alias" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/vscode/.local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_info" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/vscode/.local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_id" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(

This has already been raised here: BerriAI/litellm#2832.
In the meantime, and until LiteLLM releases are a bit more stable, we can simply ignore any warnings emitted on import so ChatDBG users don't think they come from us. An alternative would be to pin LiteLLM versions more aggressively.
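A minimal sketch of the suppression approach, using the standard-library `warnings` module (the helper name `noisy_import` is a hypothetical stand-in for `import litellm`, not ChatDBG's actual code):

```python
import warnings


def noisy_import():
    # Stand-in for "import litellm": emits a UserWarning at import time,
    # like the pydantic "protected namespace" warnings above.
    warnings.warn(
        'Field "model_list" has conflict with protected namespace "model_".',
        UserWarning,
    )


# Silence warnings only for the duration of the import; the previous
# warning filters are restored when the context manager exits.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    noisy_import()  # nothing is shown to the user
```

Scoping the filter with `warnings.catch_warnings()` keeps the suppression limited to the import itself, so legitimate warnings from the rest of the program still surface.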

stephenfreund merged commit 646d31e into main on Apr 8, 2024
2 checks passed
nicovank deleted the ignore-litellm-warnings branch on April 8, 2024 at 20:11