
Support for Moondream 0.5B and int8 models #18

Open
geoffroy-noel-ddh opened this issue Dec 9, 2024 · 0 comments
@geoffroy-noel-ddh

In December 2024, Moondream released two int8 models for CPU: a 2B and a 0.5B. They should be faster on CPU, possibly allowing bvqa to work efficiently in a GitLab pipeline or GitHub workflow (e.g. to caption images).

Note that the API for these models is slightly different from that of the ones hosted on HF.
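
For context, a minimal sketch of how the int8 models are loaded locally with the `moondream` client package, rather than via transformers for the HF-hosted models. The model file name and exact method names are assumptions based on the Moondream release notes and may need adjusting:

```python
# Sketch only: assumes `pip install moondream` and a locally downloaded
# int8 model file (e.g. moondream-0_5b-int8.mf); file name is hypothetical.
import moondream as md
from PIL import Image

# Load the quantised model from a local .mf file (runs on CPU).
model = md.vl(model="moondream-0_5b-int8.mf")

image = Image.open("page.jpg")
encoded = model.encode_image(image)  # encode once, reuse for several prompts

# Captioning
print(model.caption(encoded)["caption"])

# Visual question answering
print(model.query(encoded, "What text is visible in this image?")["answer"])
```
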

@geoffroy-noel-ddh geoffroy-noel-ddh self-assigned this Dec 9, 2024
@geoffroy-noel-ddh geoffroy-noel-ddh changed the title Support of Moondream 0.5B and int8 models Support for Moondream 0.5B and int8 models Dec 9, 2024