Hi, I used the CAME optimizer to pretrain an LLM with ZeRO Stage 1 and got an untested-optimizer warning from ZeRO.
It seems that only optimizers on the supported list have been verified to work correctly.
From my understanding, ZeRO Stage 1 is a wrapper around the optimizer that partitions and gathers optimizer states, so it should be independent of the specific optimizer type. Do I misunderstand this? Does the optimizer need to be modified for compatibility?
if zero_enabled:
    if not is_zero_supported_optimizer(basic_optimizer):
        assert (
            self.zero_allow_untested_optimizer()
        ), 'You are using an untested ZeRO Optimizer. Please add <"zero_allow_untested_optimizer": true> in the configuration file to use it.'

        if self.global_rank == 0:
            logger.warning("**** You are using ZeRO with an untested optimizer, proceed with caution *****")