
Running program.populate during training causes loss to be None for PrimalDual #342

Open
AlexWan0 opened this issue May 31, 2022 · 2 comments

@AlexWan0 (Collaborator)

Issue: Calling program.populate in the middle of a training iteration causes the loss to be None on the next training epoch.

This seems to be because program.populate sets the mode to Mode.POPULATE. The model should set it back at the beginning of training, but the PrimalDual program uses self.train(), which only changes the PyTorch training flag.

Changing line 100 of primaldualprogram.py to call self.model.mode(Mode.TRAIN) fixes this.
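
To make the failure mode concrete, here is a minimal self-contained sketch. The class internals are assumptions modeled on the names in this issue (Mode.POPULATE, self.model.mode(...), PrimalDualProgram); it mimics the described behavior rather than the actual framework source:

```python
# Minimal, self-contained sketch of the failure mode and the proposed fix.
# Mode, Model, and PrimalDualProgram are stand-ins modeled on the names in
# this issue; this is NOT the actual DomiKnowS source.
from enum import Enum

class Mode(Enum):
    TRAIN = 'train'
    POPULATE = 'populate'

class Model:
    def __init__(self):
        self._mode = Mode.TRAIN

    def mode(self, mode=None):
        # Getter/setter in one call, mirroring self.model.mode(Mode.TRAIN)
        if mode is not None:
            self._mode = mode
        return self._mode

    def loss(self):
        # The loss is only computed in TRAIN mode; otherwise it stays None,
        # which is the symptom reported above
        return 0.123 if self._mode is Mode.TRAIN else None

class PrimalDualProgram:
    def __init__(self):
        self.model = Model()

    def populate(self):
        self.model.mode(Mode.POPULATE)  # populate flips the internal mode

    def train_epoch_buggy(self):
        # Current behavior: self.train() only toggles the PyTorch training
        # flag, so the internal mode is never reset after populate()
        return self.model.loss()

    def train_epoch_fixed(self):
        # Proposed fix: explicitly reset the internal mode before training
        self.model.mode(Mode.TRAIN)
        return self.model.loss()

program = PrimalDualProgram()
program.populate()
print(program.train_epoch_buggy())  # None  -> the reported bug
print(program.train_epoch_fixed())  # 0.123 -> loss is computed again
```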

@hfaghihi15 (Collaborator)

Hi @AlexWan0,
I have pushed the latest commit to the develop_newLC branch; could you check whether it resolves your problem?

Commit: 651c7dd

@hfaghihi15 (Collaborator)

Hi @AlexWan0, could you please check whether this issue is resolved, and close it if so?
