Add full KKT differentiation mode, split into new-style PyTorch Module/Function, automatically infer some batch sizes #5

Merged · 7 commits · Aug 31, 2019

Conversation

@bamos (Collaborator) commented on Aug 22, 2019

No description provided.

@bamos requested a review from @bstellato on August 22, 2019 02:05
@bamos (Collaborator, Author) commented on Aug 22, 2019

For some more details, the full backward pass does something like this (I can write it out more properly if we end up using it):

[image: derivation of the full KKT backward pass]
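Roughly, and only as a sketch of the idea rather than a transcription of the image above: if the QP is written in the stacked inequality form min_x ½ xᵀPx + qᵀx s.t. Gx ≤ h (splitting l ≤ Ax ≤ u into G = [A; -A], h = [u; -l] and dropping rows with infinite bounds), an OptNet-style full backward pass solves the transposed, linearized KKT system and then recovers the parameter gradients from outer products:

```latex
% Sketch only (OptNet-style), not necessarily the exact expression in the image.
% x^\star: primal solution, \lambda^\star: inequality duals, \ell: downstream loss.
\[
\begin{bmatrix}
  P & G^\top \operatorname{diag}(\lambda^\star) \\
  G & \operatorname{diag}(G x^\star - h)
\end{bmatrix}
\begin{bmatrix} d_x \\ d_\lambda \end{bmatrix}
=
-\begin{bmatrix} (\partial \ell / \partial x^\star)^\top \\ 0 \end{bmatrix}
\]
% The parameter gradients then follow from outer products:
\[
\nabla_P \ell = \tfrac12\left(d_x {x^\star}^\top + x^\star d_x^\top\right), \qquad
\nabla_q \ell = d_x, \qquad
\nabla_G \ell = \operatorname{diag}(\lambda^\star)\, d_\lambda {x^\star}^\top + \lambda^\star d_x^\top, \qquad
\nabla_h \ell = -\operatorname{diag}(\lambda^\star)\, d_\lambda .
\]
```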

I've added this mode to the simple test we have, and there's also a hack in here to handle the bounds at ±∞ for now.

To get the active-set method working, I needed to use LSQR to find a least-squares solution in some near-indefinite cases where the full solve failed, although regularizing the system and doing a direct solve may be fine too.
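As an illustration of those two options (a sketch only; `K` and `rhs` stand in for the differentiated KKT matrix and the backward-pass right-hand side, not the actual osqpth internals):

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_kkt_rhs(K, rhs, reg=1e-10, use_lsqr=True):
    """Solve K d = rhs for a (possibly near-singular) KKT-style system.

    Sketch only: K (scipy.sparse) and rhs stand in for the differentiated
    KKT matrix and the backward-pass right-hand side, not osqpth internals.
    """
    if use_lsqr:
        # Least-squares solution; robust when K is near-indefinite/singular.
        return spla.lsqr(K, rhs, atol=1e-12, btol=1e-12)[0]
    # Alternative: add a small diagonal shift and do a direct sparse solve.
    K_reg = (K + reg * sp.eye(K.shape[0])).tocsc()
    return spla.spsolve(K_reg, rhs)
```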

I've also added a --qp-solver osqpth option to the OptNet sudoku experiment here; by default it uses the full backward pass and converges just like qpth. In contrast to what I sent in #4, this is now converging:

Using the full KKT system

(Keeps converging as we would expect)
[image: sudoku training curve]

Using the active set KKT system

[image: sudoku training curve]


So from here, I'll compare the derivatives from the active-set and full backward passes for this training run and pull out a specific QP with mismatched derivatives to further debug what's going on. Maybe we can review/merge this PR and further diagnose/discuss this in #6?
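For that comparison, something along these lines should be enough to flag a QP whose gradients disagree (a sketch only; `solve_full` / `solve_active` are placeholders for the layer invoked with each differentiation mode, and `params` are the QP data tensors with `requires_grad=True`):

```python
import torch

def compare_backward_modes(solve_full, solve_active, params, rtol=1e-4, atol=1e-6):
    """Compare gradients from the full-KKT and active-set backward passes.

    Sketch only: `solve_full` / `solve_active` are placeholders for the QP
    layer called with each differentiation mode; `params` are the QP data
    tensors (e.g. P, q, A, l, u) with requires_grad=True.
    """
    def grads(solve):
        # Clear any stale gradients, run the forward pass, and backprop a
        # fixed scalar loss so both modes see the same incoming gradient.
        for p in params:
            p.grad = None
        x = solve(*params)
        x.sum().backward()
        return [p.grad.detach().clone() for p in params]

    g_full, g_active = grads(solve_full), grads(solve_active)
    mismatched = False
    for i, (gf, ga) in enumerate(zip(g_full, g_active)):
        if not torch.allclose(gf, ga, rtol=rtol, atol=atol):
            mismatched = True
            print(f"param {i}: max abs diff {(gf - ga).abs().max().item():.3e}")
    return mismatched
```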

@bstellato (Collaborator) commented

Sure, thanks! Apologies for the delay. It's still quite unclear to me why this happens.
