Podcastfy not working with local model #215

Open
Davide-gtr opened this issue Dec 20, 2024 · 2 comments

Comments

@Davide-gtr

Hello, when I try to use Podcastfy with a local model (TinyLlama, for example), it doesn't generate the right formatting, and sometimes it doesn't generate anything at all.

I don't have this issue with an OpenAI model (GPT-4o), so could it be related to the context window? I know TinyLlama has only a 2048-token context window.

Or is it something else?
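
For what it's worth, a quick way to test the context-window theory is to tokenize the same content that gets fed to the model and compare it against TinyLlama's 2048-token limit. This is only a standalone sketch, not part of Podcastfy; the model id is the TinyLlama chat checkpoint on Hugging Face and the input file is a placeholder for whatever content you pass in:

```python
# Rough check: does the content prompt alone already overflow TinyLlama's
# 2048-token context window? (Standalone sketch, not part of Podcastfy.)
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# Placeholder: a dump of the text you pass to Podcastfy (URL extraction, PDF, etc.)
prompt = open("extracted_content.txt", encoding="utf-8").read()
n_tokens = len(tokenizer.encode(prompt))

context_window = 2048
print(f"Prompt uses {n_tokens} tokens (window: {context_window})")
if n_tokens >= context_window:
    print("The input alone overflows the window, leaving no room to generate the dialogue.")
```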

Thank you in advance for your help and for your tool.

@souzatharsis (Owner)

souzatharsis commented Dec 20, 2024 via email

@Davide-gtr (Author)

Hi,
Yes, I understand the limits of local models; I also tested with the Llama 3.1 model but I have the same issue.
I will definitely take a look at the outlines library; maybe I can enhance the formatting with it.
Thanks a lot for the answers!
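
As a follow-up pointer, the idea with outlines is to constrain a local Hugging Face model so that every generated turn already matches the speaker-tag format the transcript needs, instead of hoping the model formats it correctly on its own. Below is only a rough sketch against the outlines 0.x API; the model id, the regex pattern, and the prompt are illustrative assumptions, not how Podcastfy wires this up internally:

```python
import outlines

# Load a local model through outlines' transformers wrapper (outlines 0.x API).
model = outlines.models.transformers("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# Constrain generation to alternating <Person1>/<Person2> turns, a simplified
# stand-in for the dialogue format expected in the transcript.
pattern = r"(<Person1>[^<]{1,300}</Person1>\n<Person2>[^<]{1,300}</Person2>\n){1,10}"
generator = outlines.generate.regex(model, pattern)

prompt = "Write a short two-host podcast dialogue summarizing the article above."
transcript = generator(prompt, max_tokens=1024)
print(transcript)
```

Even with constrained decoding, the 2048-token window still caps how much source material plus dialogue fits in one pass, so the content may need to be chunked first.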
