1. C++ acceleration, like llama.cpp: you could also build a mixtral.cpp that supports mixtral-8x7b and mixtral-7b with flexible switching between f32, f16, and other precisions.
2. Code for full-parameter training and prompt-learning fine-tuning, along with the corresponding data JSON format.
Good suggestion. For 2, our team already supports fine-tuning; see https://github.com/InternLM/xtuner/tree/main/xtuner/configs/mixtral for more information. For 1, stay tuned.
Thanks.
What about training from scratch? I would like to modify dim, hidden_dim, vocab_size, and so on. Could you provide a train.py?
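For reference, a minimal sketch of what building a Mixtral-style model from scratch with custom dimensions could look like, assuming the Hugging Face transformers Mixtral classes (MixtralConfig / MixtralForCausalLM) rather than an official train.py from this repo; all sizes below are placeholders, and the actual training loop (data loading, optimizer, parallelism) would still need to be supplied:

```python
# Sketch only: randomly initialized Mixtral-style model with custom sizes,
# assuming the Hugging Face `transformers` Mixtral implementation.
from transformers import MixtralConfig, MixtralForCausalLM

config = MixtralConfig(
    vocab_size=32000,          # vocab_size
    hidden_size=1024,          # "dim" in llama.cpp-style naming
    intermediate_size=3584,    # "hidden_dim" of each expert's FFN
    num_hidden_layers=8,
    num_attention_heads=16,
    num_key_value_heads=4,
    num_local_experts=8,       # 8 experts, as in Mixtral-8x7B
    num_experts_per_tok=2,     # top-2 routing
)
model = MixtralForCausalLM(config)  # randomly initialized, ready for pretraining
print(f"parameters: {sum(p.numel() for p in model.parameters()):,}")
```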
Also, please document example data JSON files; the formats that should presumably be supported are supervised (QA pairs and multi-turn dialogue) and unsupervised (long documents).
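To make the request concrete, here is a hypothetical sketch of what those three record types could look like when written as JSON lines; the field names ("question", "answer", "conversation", "text") are placeholders, not a schema the project has committed to:

```python
# Illustrative data records for the three cases mentioned above,
# serialized to a JSONL file. Field names are assumptions, not an official format.
import json

supervised_qa = {"question": "What is MoE?", "answer": "Mixture-of-Experts is ..."}

supervised_dialogue = {
    "conversation": [
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi, how can I help?"},
        {"role": "user", "content": "Explain top-2 routing."},
        {"role": "assistant", "content": "Each token is routed to the 2 highest-scoring experts ..."},
    ]
}

unsupervised_doc = {"text": "A long document used for plain language-model pretraining ..."}

with open("data.jsonl", "w", encoding="utf-8") as f:
    for record in (supervised_qa, supervised_dialogue, unsupervised_doc):
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```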