Update DOFA configuration with training parameters and learning rate adjustments

- Added gradient clipping and gradient accumulation settings for improved training stability.
- Reduced maximum training epochs from 150 to 50 and adjusted early stopping patience from 20 to 10 epochs.
- Updated learning rate from 0.001 to 2e-5 and added cooldown and minimum learning rate settings for the scheduler.
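A sketch of how these settings might appear in a Lightning-style YAML training config. Every key name and unspecified value below is a hypothetical reconstruction for illustration; the actual DOFA configuration file is not shown in this commit:

```yaml
# Hypothetical sketch of the described changes; key names and the
# clipping/accumulation/cooldown values are assumptions, not taken
# from the real DOFA config.
trainer:
  max_epochs: 50              # reduced from 150
  gradient_clip_val: 1.0      # gradient clipping (value assumed)
  accumulate_grad_batches: 4  # gradient accumulation (value assumed)

early_stopping:
  patience: 10                # reduced from 20

optimizer:
  lr: 2e-5                    # reduced from 0.001

lr_scheduler:
  cooldown: 5                 # scheduler cooldown (value assumed)
  min_lr: 1e-7                # minimum learning rate (value assumed)
```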