
Fix network loading #1533

Merged: 11 commits into Project-MONAI:main on Aug 31, 2023
Conversation

@matt3o (Contributor) commented Aug 27, 2023

This is the missing code for #1521, where I misunderstood @SachidanandAlle.

This code sets load_strict=False as the default value of BasicInferTask in the derived subclasses. BundleInferTask also now matches the previous behavior.

The only model with a value of True is HovernetNuclei.

This PR fixes networks that would likely break whenever weight values are missing, since the other PR enabled exactly that strict check.
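The pattern described above can be sketched as follows. This is an illustrative, simplified stand-in, not the actual MONAI Label classes: the base class keeps the new strict default from #1521, while the subclass names shown here are hypothetical examples of derived tasks pinning load_strict=False (with the HoverNet nuclei task kept strict, as the PR notes).

```python
# Simplified sketch of the load_strict override pattern in this PR.
# Class and parameter names mirror the discussion but are illustrative,
# not the real MONAI Label API surface.

class BasicInferTask:
    def __init__(self, path, network=None, load_strict=True, **kwargs):
        # After #1521, the base default is strict checkpoint loading.
        self.path = path
        self.network = network
        self.load_strict = load_strict


class SegmentationInferTask(BasicInferTask):  # hypothetical subclass name
    def __init__(self, path, network=None, **kwargs):
        # Restore the pre-#1521 behavior for this model family.
        super().__init__(path, network=network, load_strict=False, **kwargs)


class HovernetNucleiInferTask(BasicInferTask):  # the one model kept strict
    def __init__(self, path, network=None, **kwargs):
        super().__init__(path, network=network, load_strict=True, **kwargs)
```

The point is that each subclass states its choice explicitly at the call site, so a reader can see at a glance which models tolerate missing keys.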

@matt3o (Contributor, Author) commented Aug 30, 2023

@tangy5 Can you look into this? As it stands, some models will probably break on the dev branch, since we changed the default value of load_strict to True.

@SachidanandAlle (Collaborator) commented:

Is this the PR now? And why so many changes again? Is there a rebase issue on your branch?

@SachidanandAlle SachidanandAlle marked this pull request as ready for review August 31, 2023 15:45
@SachidanandAlle SachidanandAlle enabled auto-merge (squash) August 31, 2023 15:45
Fix the order for load_strict

Signed-off-by: SACHIDANAND ALLE <[email protected]>
Signed-off-by: Sachidanand Alle <[email protected]>
@SachidanandAlle SachidanandAlle merged commit c90f42c into Project-MONAI:main Aug 31, 2023
14 checks passed
@matt3o (Contributor, Author) commented Aug 31, 2023

Hi @SachidanandAlle,

Sorry for the confusion. In the other PR I made the mistake of not completing everything, since I first wanted to know whether the idea was fine for everyone. I therefore changed the default value in basic_infer and the value for Deepgrow just to show how it would look, planned to add the other inferers afterwards, and was waiting for your confirmation before proceeding. However, you merged it right after the discussion, apparently assuming all the work was already done.
Your comments were absolutely right: as that PR stood, it would probably have broken older models, which is why I opened this second PR to fix that.

This is the final PR. I changed it for all the basic_infer-based models I found; I hope that was all of them. About the rebase: yes, this is the code state of the last branch; I did not update from main in the meantime. I hope that is fine; otherwise I can merge main in, and then it should only show the most recent commit.

Once again, sorry for the misunderstanding!

@SachidanandAlle (Collaborator) commented:

Hey, no worries. I see what the concern was. By default, we don't load strictly, and hence people were finding it difficult to debug why inference results were bad (any load failure leaves random weights).

So I was OK with making it strict by default, and overriding it to False in each model so that people can see and understand it.

And thanks for the contributions. Please continue helping.
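The debugging problem described here can be shown with a minimal sketch. Plain dicts stand in for a PyTorch state_dict (the real API is `Module.load_state_dict(state_dict, strict=...)`); the helper below is a hypothetical simplification, not torch code.

```python
# Why strict loading aids debugging: a non-strict load silently skips
# missing keys, leaving those parameters at their (random) initial values.

def load_state_dict(params, state_dict, strict=True):
    """Copy state_dict values into params, mimicking torch's strict flag."""
    missing = [k for k in params if k not in state_dict]
    unexpected = [k for k in state_dict if k not in params]
    if strict and (missing or unexpected):
        raise RuntimeError(f"missing={missing}, unexpected={unexpected}")
    for k in state_dict:
        if k in params:
            params[k] = state_dict[k]
    return missing, unexpected

params = {"conv.weight": 0.0, "fc.weight": 0.0}  # 0.0 stands for "random init"
checkpoint = {"conv.weight": 1.5}                # fc.weight missing from file

# Non-strict: loads silently; fc.weight keeps its random init -> bad inference.
missing, _ = load_state_dict(params, checkpoint, strict=False)

# Strict: fails loudly, pointing straight at the missing key.
try:
    load_state_dict(dict(params), checkpoint, strict=True)
    failed = False
except RuntimeError:
    failed = True
```

With strict=True the failure surfaces at load time instead of as mysteriously poor inference results later.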
