About stable block #20
Hi, thanks for your interest in our work.
The code for the scaling weights is here:
class-incremental-learning/adaptive-aggregation-networks/trainer/base_trainer.py, lines 313 to 316 (commit 834de46), where the scaling weights are applied.
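For anyone else looking for this, here is a minimal sketch of what such level-wise scaling (aggregation) weights could look like: two learnable scalars per residual level that mix the stable-branch and plastic-branch feature maps. The class name, the softmax normalization, and the attribute names below are my own illustration, not the repo's actual code.

```python
import torch
import torch.nn as nn

class WeightedAggregation(nn.Module):
    """Illustrative sketch: learnable scaling weights that mix the outputs
    of a stable branch and a plastic branch, level by level."""

    def __init__(self, num_levels=3):
        super().__init__()
        # One pair of learnable scalars per residual level (hypothetical layout).
        self.alphas = nn.Parameter(torch.zeros(num_levels, 2))

    def forward(self, stable_feats, plastic_feats):
        # stable_feats / plastic_feats: lists of per-level feature maps.
        mixed = []
        for level, (s, p) in enumerate(zip(stable_feats, plastic_feats)):
            w = torch.softmax(self.alphas[level], dim=0)  # normalize the two weights
            mixed.append(w[0] * s + w[1] * p)
        return mixed
```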
Thanks, what about the FC weight in the stable block?
For the FC classifier, we directly follow LUCIR.
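For readers unfamiliar with LUCIR: it replaces the usual linear head with a cosine-normalized classifier whose logits are rescaled by a learnable factor. A minimal sketch of that idea, written as my own illustration rather than the repo's code:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineLinear(nn.Module):
    """Sketch of a LUCIR-style cosine-normalized classifier."""

    def __init__(self, in_features, out_features, learnable_scale=True):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))
        # Learnable temperature that rescales the cosine logits.
        self.sigma = nn.Parameter(torch.ones(1)) if learnable_scale else None

    def forward(self, x):
        # Logits are cosine similarities between L2-normalized features and weights.
        out = F.linear(F.normalize(x, dim=1), F.normalize(self.weight, dim=1))
        if self.sigma is not None:
            out = self.sigma * out
        return out
```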
In your paper, you said you apply a small set of scaling weights in the stable block, but I can't find the corresponding code. Can you tell me where to find it?
Also, I read the code for the optimizer in `base_trainer.py`, in the function `set_optimizer`. The parameters of `b2_model` are learnable if the 2nd branch is not fixed. But for `b1_model`, the FC weights for the old classes are frozen and all the other parameters are put into the optimizer. Why? And what about the scaling weights? If we optimize the parameters of `b1_model` just as we do for `b2_model`, how can it be called a stable block? (Even though the FC weights for the old classes are frozen.)
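For context on the question above, here is a minimal sketch of how such an optimizer setup could be arranged: the old-class FC rows of the stable branch are frozen while everything else goes into the optimizer, and the plastic branch is fully trainable unless it is explicitly fixed. The parameter name `fc_old` and the function signature are hypothetical, not the repo's actual `set_optimizer`.

```python
import torch.optim as optim

def set_optimizer_sketch(b1_model, b2_model, lr, fix_branch2=False):
    """Illustrative sketch only, under the assumptions stated above."""
    params = []

    # Stable branch (b1): freeze the old-class FC weights, train the rest.
    for name, p in b1_model.named_parameters():
        if "fc_old" in name:          # hypothetical name for the old-class FC rows
            p.requires_grad = False   # old-class boundaries stay untouched
        else:
            p.requires_grad = True
            params.append(p)

    # Plastic branch (b2): fully trainable unless explicitly fixed.
    for p in b2_model.parameters():
        p.requires_grad = not fix_branch2
        if not fix_branch2:
            params.append(p)

    return optim.SGD(params, lr=lr, momentum=0.9, weight_decay=5e-4)
```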