Using transformers==4.29.2, I get this warning when running recover_model_weights.py:
WARNING:root:Your base LLaMA checkpoint is converted with transformers==4.27.0.dev0, but transformers>=4.29.2 is expected. This may produce a corrupted checkpoint and lead to unexpected behavior. Please regenerate your base LLaMA checkpoint with transformers>=4.29.2.
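The warning means the base LLaMA checkpoint was converted with an older transformers than the weight diff expects, so merging may silently corrupt the result. A minimal sketch of the kind of version guard involved (the helper names here are hypothetical, not from the repo; the minimum version comes from the warning text above):

```python
# Hypothetical sketch: compare the installed transformers version against
# the minimum the weight diff expects (4.29.2, per the warning above).
MIN_VERSION = (4, 29, 2)

def parse_version(version: str) -> tuple:
    # Keep only the leading numeric segments, e.g. "4.27.0.dev0" -> (4, 27, 0)
    parts = []
    for piece in version.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break
    return tuple(parts)

def is_supported(version: str) -> bool:
    # Tuple comparison is elementwise, so (4, 27, 0) < (4, 29, 2)
    return parse_version(version) >= MIN_VERSION

# The checkpoint in the warning was converted with 4.27.0.dev0 -> too old
print(is_supported("4.27.0.dev0"))  # False
print(is_supported("4.29.2"))       # True
```

If the check fails, the fix is the one the warning suggests: re-run the base checkpoint conversion under transformers>=4.29.2 before merging.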
lolipopshock pushed a commit to lolipopshock/alpaca_farm that referenced this issue on Sep 24, 2023.