Fix network loading #1533
Conversation
Signed-off-by: Matthias Hadlich <[email protected]>
@tangy5 Can you look into this? As it stands, some models will probably break in the dev branch, since we changed the default value of
So this is the PR now? And why so many changes again? Some rebase issue on your branch?
Fix the order for load_strict Signed-off-by: SACHIDANAND ALLE <[email protected]>
Signed-off-by: Sachidanand Alle <[email protected]>
Hi @SachidanandAlle, sorry for the confusion. In the other commit I made the mistake of not completing everything, since I first wanted to know whether the idea was fine for everyone. I therefore only changed the default value in basic_infer and the value for Deepgrow, just to show how it would look. I wanted to add the other inferers afterwards and was waiting for your confirmation before proceeding. However, you merged it right after the discussion, apparently assuming all the work was already done.

This is the final PR: I changed it for all the basic_infer-based models I found, and I hope that was all of them. About the rebase: yes, this is the code state of the old branch; I did not update from main in the meantime. I hope that is fine, otherwise I can merge in main so that only the most recent commit shows. Once again, sorry for the misunderstanding!
Hey, no worries. I see what the concern was: by default we don't load strictly, and hence people were finding it difficult to debug why inference results were bad (any load failure leaves random weights). So I was OK with making it strict by default and overriding it to False in each model, so people can see and understand it. Thanks for the contributions, please continue helping.
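The pattern described above (strict by default in the base task, with an explicit per-model opt-out) can be sketched as follows. This is a simplified illustration, not MONAI Label's actual code: the class names mirror the PR's `BasicInferTask` and a Deepgrow-style subclass, but the constructors are reduced to the single parameter under discussion.

```python
# Hedged sketch of the design discussed above: the base infer task
# defaults to strict checkpoint loading, and each model that knowingly
# ships partial weights overrides it visibly. Class/parameter names are
# illustrative, modeled on the PR's BasicInferTask/load_strict.
class BasicInferTask:
    def __init__(self, path, load_strict=True):
        self.path = path
        self.load_strict = load_strict  # strict by default

class DeepgrowInferTask(BasicInferTask):
    def __init__(self, path):
        # Explicit, per-model opt-out so readers see why loading is lenient.
        super().__init__(path, load_strict=False)

task = DeepgrowInferTask("model.pt")
print(task.load_strict)  # False
```

Keeping the opt-out in each subclass, rather than flipping the base default back to lenient, is the point of the design: the exception is documented at the call site where a maintainer would look.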
This is the missing code for #1521, where I misunderstood @SachidanandAlle. The code sets the default value `load_strict=False` of `BasicInferTask` in the derived subclasses. The `BundleInferTask` also now matches the previous behavior. The only model with a value of `True` is `HovernetNuclei`. This PR fixes the likely breakage of network loading whenever weight values are missing, since the other PR enabled strict loading by default.
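To make the failure mode concrete: with lenient loading, a checkpoint that is missing keys loads "successfully" and the uncovered layers keep their random initialization, which shows up later as inexplicably bad inference results. The snippet below is a toy, pure-Python stand-in for the strict/lenient behavior of PyTorch's `load_state_dict(..., strict=...)`; the function and parameter names are illustrative only.

```python
# Toy stand-in (no PyTorch dependency) for strict vs. lenient checkpoint
# loading, to show why strict-by-default is safer.
def load_state_dict(params, checkpoint, strict=True):
    """Copy checkpoint values into params; mimic strict/lenient modes."""
    missing = sorted(k for k in params if k not in checkpoint)
    unexpected = sorted(k for k in checkpoint if k not in params)
    if strict and (missing or unexpected):
        # Strict mode: fail loudly instead of silently skipping keys.
        raise RuntimeError(
            f"missing keys: {missing}, unexpected keys: {unexpected}")
    for key, value in checkpoint.items():
        if key in params:
            params[key] = value
    return missing, unexpected

# A model with two layers, but a checkpoint missing the second layer.
params = {"layer1.weight": 0.0, "layer2.weight": 0.0}
ckpt = {"layer1.weight": 1.0}

# Lenient (the old default): loads silently; layer2 keeps its random init.
missing, _ = load_state_dict(dict(params), ckpt, strict=False)
print(missing)  # ['layer2.weight']

# Strict (the new default): the partial checkpoint is rejected up front.
try:
    load_state_dict(dict(params), ckpt, strict=True)
except RuntimeError:
    print("strict load failed")
```

With strict loading, the mismatch surfaces at load time with a clear error, rather than as degraded predictions that are much harder to trace back to a bad checkpoint.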