
Q: can Ollama generate longer summaries (based on word length or chapter)? #21

Open
BradKML opened this issue Mar 21, 2024 · 0 comments


BradKML commented Mar 21, 2024

I am currently testing with captured blog posts that contain many lines and paragraphs to see how they get summarized. However, most of the time Ollama (running Qwen for its large context window and small footprint) produces only one-line summaries. Even asking questions with the page as context does not yield summaries of comparable length; Q&A often produces multi-line answers while the summaries stay at one line.
Are there ways to set a maximum or ideal length for the output? Alternatively, is there a way to "chunk" the page into individual pieces that Ollama treats as separate inputs (semantic chunking)?
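Not the maintainer, but for reference: Ollama exposes a `num_predict` option that raises the cap on generated tokens, and a simple paragraph-based grouping can approximate the chunking described above. A minimal sketch, assuming the `ollama` Python client is available; the model name, chunk size, and prompt wording are assumptions, not part of this project:

```python
# Sketch: paragraph-based chunking plus a per-chunk summarization call.
# chunk_paragraphs() is plain Python; summarize_chunks() assumes a running
# Ollama server and uses the client's documented generate() call with the
# "num_predict" option to allow longer outputs.

def chunk_paragraphs(text: str, max_chars: int = 2000) -> list[str]:
    """Group consecutive paragraphs into chunks of roughly max_chars."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        para = para.strip()
        if not para:
            continue
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)   # current chunk is full; start a new one
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks


def summarize_chunks(text: str, model: str = "qwen:7b") -> str:
    """Summarize each chunk separately, asking for a multi-sentence summary."""
    import ollama  # requires a running Ollama server

    summaries = []
    for chunk in chunk_paragraphs(text):
        resp = ollama.generate(
            model=model,  # model name is an assumption; use whichever is pulled
            prompt=f"Summarize the following in 3-5 sentences:\n\n{chunk}",
            options={"num_predict": 256},  # raise the output-token cap
        )
        summaries.append(resp["response"])
    return "\n\n".join(summaries)
```

Joining the per-chunk summaries (or summarizing them once more) then gives a multi-paragraph result whose length scales with the input rather than collapsing to one line.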
