feat: add solar pro llm and document parser #16099
Conversation
@@ -207,3 +231,50 @@ def get_num_tokens_from_message(self, messages: Sequence[ChatMessage]) -> int:
        )
        num_tokens += tokens_suffix
        return num_tokens

    @llm_retry_decorator
    def _chat(self, messages: Sequence[ChatMessage], **kwargs: Any) -> ChatResponse:
curious why we subclass _chat, but not any other method like _stream_chat or _achat, etc.
I guess you only support document parsing with .chat()?
There was a mistake in following the llama-index interface. Added the missing methods.
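The reviewer's concern above can be sketched in plain Python. This is an illustrative stand-in, not the PR's actual code: `llm_retry_decorator` and `_chat` mirror names from the diff, but the class below is a hypothetical minimal mock of llama-index's LLM interface. The point it demonstrates: if only `_chat` is decorated, callers going through `_stream_chat` or `_achat` would bypass the retry logic, so every entry point needs the same treatment.

```python
import functools

def llm_retry_decorator(fn, max_retries=3):
    """Illustrative stand-in for the PR's retry decorator: retry a chat
    method on transient errors before giving up."""
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        last_exc = None
        for _ in range(max_retries):
            try:
                return fn(self, *args, **kwargs)
            except ConnectionError as exc:  # treat as transient and retry
                last_exc = exc
        raise last_exc
    return wrapper

class SolarLikeLLM:
    """Hypothetical minimal subclass. Note that async and streaming
    variants (_achat, _stream_chat) need wrappers that await or consume
    the response before returning; otherwise errors raised lazily escape
    the retry loop, which is why they are easy to miss."""

    def __init__(self):
        self.attempts = 0

    @llm_retry_decorator
    def _chat(self, messages, **kwargs):
        self.attempts += 1
        if self.attempts < 2:  # simulate one transient failure
            raise ConnectionError("transient network error")
        return f"response to {len(messages)} message(s)"
```

Calling `SolarLikeLLM()._chat(["hello"])` fails once, retries, and succeeds on the second attempt.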
    for i, doc in enumerate(docs):
        file_title = file_titles[min(i, len(file_titles) - 1)]
        document_contents += f"{file_title}:\n{doc.text}\n\n"
    print("DOCUMENT CONTENTS", document_contents)
let's remove this print
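For reference, the loop under review can run standalone like this; `Doc` is a hypothetical stand-in for llama-index's `Document` type, and the debug `print` is dropped as requested. The `min(i, ...)` indexing means that when there are fewer titles than documents, the last title is reused:

```python
from dataclasses import dataclass

@dataclass
class Doc:
    """Hypothetical stand-in for llama-index's Document."""
    text: str

def build_document_contents(docs, file_titles):
    """Prefix each document's text with a title; reuse the last title
    when there are more documents than titles."""
    document_contents = ""
    for i, doc in enumerate(docs):
        file_title = file_titles[min(i, len(file_titles) - 1)]
        document_contents += f"{file_title}:\n{doc.text}\n\n"
    return document_contents
```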
tokenizers = "^0.19.1"
llama-index-core = "^0.11.0"
llama-index-llms-openai = "^0.2.0"
we need to add the upstage reader as a dependency now right?
(should also bump the version of the integration package here)
Or, maybe we need to publish the reader first actually
Yeah, you are right. We should publish the reader first.

> we need to add the upstage reader as a dependency now right?

Yes, we need to add it.
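Once the reader is published, the requested dependency change would look roughly like this in the integration's `pyproject.toml`. The package name `llama-index-readers-upstage` and the `^0.1.0` version are assumptions for illustration; use whatever name and version the reader is actually published under:

```toml
[tool.poetry.dependencies]
tokenizers = "^0.19.1"
llama-index-core = "^0.11.0"
llama-index-llms-openai = "^0.2.0"
# assumed name/version for the newly published reader package
llama-index-readers-upstage = "^0.1.0"
```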
no need to check this in
no need to check this in
Looks good, just a few things to clean up
Description

Add a new LLM model and the reader API of Upstage.

The new LLM model, solar-pro, can be found here: https://developers.upstage.ai/docs/apis/chat

The new document parse model, Document Parse, can be found here: https://developers.upstage.ai/docs/apis/document-parse

New Package?

Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?

Version Bump?

Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)

Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
Suggested Checklist:

- I ran make format; make lint to appease the lint gods