How do I make sure that when I resume training, the learning rate scheduler is reset so that the new configuration takes effect?
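One workaround (a hedged sketch, not an official MMEngine recipe) is to strip the saved scheduler state out of the checkpoint before resuming, so the schedulers are rebuilt from the edited config instead of being restored. The top-level key name `param_schedulers` is an assumption based on common MMEngine checkpoint layouts; inspect your own `.pth` file with `ckpt.keys()` first. The snippet builds a tiny stand-in checkpoint so it is runnable end to end:

```python
import torch

# Stand-in checkpoint with the top-level layout an MMEngine checkpoint
# typically has (key names are an assumption -- check your real epoch_48.pth).
ckpt = {
    'meta': {'epoch': 48},
    'state_dict': {'w': torch.zeros(1)},
    'optimizer': {'state': {}, 'param_groups': []},
    'param_schedulers': [{'last_step': 48}],
}
torch.save(ckpt, 'epoch_48.pth')

# Drop the saved scheduler state so that, on resume, the schedulers are
# rebuilt from the (edited) config instead of being restored verbatim.
ckpt = torch.load('epoch_48.pth', map_location='cpu')
ckpt.pop('param_schedulers', None)
torch.save(ckpt, 'epoch_48_no_sched.pth')
```

Resuming from `epoch_48_no_sched.pth` should then let the new `param_scheduler` entries apply, while the model weights and optimizer state are kept.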
zhangchao-s changed the title from "[Docs] after load_checkpoint, change param_scheduler is no effective" to "[Docs] after resume training, change param_scheduler is no effective" on Jul 25, 2024.
📚 The doc issue
In my work, I have already trained for 48 epochs, and I want to use epoch_48.pth to continue training up to 56 epochs. But after I change param_scheduler in the config, the learning rate does not follow the new config. What is the problem, and how can I solve it?
Before changing the config:
param_scheduler = [
    dict(
        type='LinearLR',
        start_factor=1.0 / 3,
        by_epoch=False,
        begin=0,
        end=500),
    dict(
        type='MultiStepLR',
        begin=0,
        end=48,
        by_epoch=True,
        milestones=[32, 44],
        gamma=0.1),
]
After changing the config (what I want):
param_scheduler = [
    dict(
        type='LinearLR',
        start_factor=1.0 / 3,
        by_epoch=False,
        begin=0,
        end=500),
    dict(
        type='MultiStepLR',
        begin=0,
        end=56,
        by_epoch=True,
        milestones=[32, 44, 56],
        gamma=0.1),
]
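It is worth checking what learning rate the edited schedule would actually produce even if it did take effect. The sketch below assumes a base LR of 1e-3 (chosen so that 1e-3 * 0.1**2 = 1e-5 matches the value reported at epoch 48; the real base LR is not shown in the issue). Note that the new milestone at 56 never fires during training, since training ends at epoch 56, so the LR between epochs 44 and 56 would stay at 1e-5 under the new schedule as well:

```python
# MultiStepLR semantics: decay by gamma once for each milestone already
# passed. base_lr = 1e-3 is an assumption, not taken from the issue.
gamma, milestones, base_lr = 0.1, [32, 44, 56], 1e-3

def lr_at(epoch):
    # LR after `epoch` epochs under the edited MultiStepLR schedule
    return base_lr * gamma ** sum(m <= epoch for m in milestones)

for e in (31, 32, 44, 48, 55):
    print(e, lr_at(e))
```

Under these assumptions, epochs 44 through 55 all sit at roughly 1e-5, so the unchanged LR the reporter observed is consistent with the new schedule too; the milestone at 56 only matters if training continued past epoch 56.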
At epoch 48 the learning rate was 1e-5; after changing the configuration and resuming training, the learning rate for the remaining epochs stayed at 1e-5, with no change.
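The likely root cause can be reproduced with plain PyTorch, independent of MMEngine: a scheduler's `state_dict()` stores its milestones, so restoring the saved state on resume silently overwrites the milestones built from the edited config.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

opt = SGD([torch.nn.Parameter(torch.zeros(1))], lr=1e-3)

old = MultiStepLR(opt, milestones=[32, 44], gamma=0.1)       # original config
saved = old.state_dict()                                     # what resume restores

new = MultiStepLR(opt, milestones=[32, 44, 56], gamma=0.1)   # edited config
new.load_state_dict(saved)                                   # resume behaviour
print(sorted(new.milestones))                                # [32, 44] -- edit discarded
```

Because `load_state_dict` replaces the scheduler's attributes wholesale, the new milestone list from the config never survives a resume; this matches the behavior reported above.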
Suggest a potential alternative/fix
No response