Help me pls #168

Open
aritralegndery opened this issue Mar 17, 2024 · 3 comments

Comments

@aritralegndery

Traceback (most recent call last):
  File "e:\llm\TinyLlama\pretrain\tinyllama.py", line 17, in <module>
    from lit_gpt.model import GPT, Block, Config, CausalSelfAttention
  File "E:\llm\TinyLlama\lit_gpt\__init__.py", line 1, in <module>
    from lit_gpt.model import GPT
  File "E:\llm\TinyLlama\lit_gpt\model.py", line 13, in <module>
    from flash_attn import flash_attn_func
ModuleNotFoundError: No module named 'flash_attn'

@jzhang38
Owner

Follow the instructions at https://github.com/Dao-AILab/flash-attention to install flash attn.
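
For reference, the flash-attention README installs the package with `pip install flash-attn --no-build-isolation` (building from source needs a CUDA toolchain and `ninja`). Below is a minimal, hypothetical sanity check, not part of the TinyLlama repo, that exercises the same `flash_attn_func` import the traceback complains about; the tensor shapes are illustrative and assume an fp16-capable CUDA GPU in the same environment used to run `pretrain/tinyllama.py`.

```python
# Hypothetical post-install check (not from the TinyLlama repo):
# confirms that `from flash_attn import flash_attn_func` now resolves
# and that the kernel runs on the GPU.
import torch
from flash_attn import flash_attn_func  # the import that raised ModuleNotFoundError

# flash_attn_func expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on CUDA.
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # expected: torch.Size([1, 128, 8, 64])
```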

@aritralegndery
Author

> Follow the instructions at https://github.com/Dao-AILab/flash-attention to install flash attn.

Thank you, sir.

@lixali

lixali commented Nov 10, 2024

This is taking a very long time to install. It has been running for 6 hours now and still isn't finished. Is this normal?


3 participants