Issues: aidatatools/ollama-benchmark
- #17 "cant detect intel arc a770 ubuntu 22.04" (Investigation), opened Sep 11, 2024 by Xyz00777
- #15 "LLM Benchmark crashes on llava:13b when used on Nvidia GPU" (Investigation), opened Aug 19, 2024 by synchronic1
- #13 "[Feature Request] Pull model through Ollama API instead of invoking ollama binary", opened Jul 4, 2024 by yeahdongcn
- #11 "Adding CPU/GPU distribution to the logs and reports" (enhancement), opened Jun 11, 2024 by dan-and
- #8 "Running on Non GPU laptops" (Investigation), opened May 23, 2024 by twelsh37