[docs] add Transitioning from MATLAB tutorial #3698
@@ -270,13 +270,58 @@ objective_value(model)
# end
# ```

# ## Symmetric and Hermitian matrices

# Julia has specialized support for symmetric and Hermitian matrices in the
# `LinearAlgebra` package:

import LinearAlgebra

# If you have a matrix that is numerically symmetric:

x = [1 2; 2 3]

#-

LinearAlgebra.issymmetric(x)

# then you can wrap it in a `LinearAlgebra.Symmetric` matrix to tell Julia's
# type system that the matrix is symmetric.

LinearAlgebra.Symmetric(x)

# Using a `Symmetric` matrix lets Julia and JuMP use more efficient algorithms
# when they are working with symmetric matrices.
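# For example, eigenvalue routines dispatch to a specialized symmetric solver
# and return real eigenvalues (one illustrative sketch of what the wrapper
# enables):

LinearAlgebra.eigvals(LinearAlgebra.Symmetric(x))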
# If you have a matrix that is nearly but not exactly symmetric:

x = [1.0 2.0; 2.001 3.0]
LinearAlgebra.issymmetric(x)

# then you could, as you might do in MATLAB, make it numerically symmetric as
# follows:

x_sym = 0.5 * (x + x')

# In Julia, you can explicitly choose whether to use the lower or upper triangle
# of the matrix:

x_sym = LinearAlgebra.Symmetric(x, :L)

#-

x_sym = LinearAlgebra.Symmetric(x, :U)

# The same applies for Hermitian matrices, using `LinearAlgebra.Hermitian` and
# `LinearAlgebra.ishermitian`.
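# For example, a small complex matrix whose conjugate transpose equals itself
# can be checked and wrapped in the same way (an illustrative sketch):

h = [1 2im; -2im 3]
LinearAlgebra.ishermitian(h)

#-

LinearAlgebra.Hermitian(h)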
# ## Primal versus dual form

# When you translate some optimization problems from YALMIP or CVX to JuMP, you
# might be surprised to see it get much faster or much slower, even if you're
# using exactly the same solver. The most likely reason is that YALMIP will
# always interpret the problem as the dual form, whereas CVX and JuMP will try to
# interpret the problem in the form most appropriate to the solver. If the
# problem is more naturally formulated in the primal form it is likely that
# YALMIP's performance will suffer, or if JuMP gets it wrong, its performance will
# suffer. It might be worth trying both primal and dual forms if you're having
# performance problems.
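One way to try both forms from JuMP is the Dualization.jl package; the following is a minimal sketch, assuming Dualization.jl and the SCS solver are installed (any conic solver works the same way):

```julia
using JuMP, SCS
import Dualization

# The solver receives the problem in the form JuMP chooses:
model_primal = Model(SCS.Optimizer)

# The solver receives the dualized problem instead: Dualization.jl rewrites
# the model into its conic dual before passing it to SCS.
model_dual = Model(Dualization.dual_optimizer(SCS.Optimizer))
```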
@@ -334,7 +379,7 @@ function robustness_jump(d)
    optimize!(model)
    if is_solved_and_feasible(model)
        WT = dual(PPT)
-       return value(λ), LinearAlgebra.dot(WT, rhoT)
+       return value(λ), real(LinearAlgebra.dot(WT, rhoT))
**Comment:** Why did you add `real`?

**Reply:** Because the docs currently build with Julia 1.10 and we support Julia 1.6. Also, v1.11 isn't released yet.

**Reply:** What's the problem? It doesn't cause any error with older versions of Julia; the output is just slightly less neat.

**Reply:**

```julia
julia> import LinearAlgebra
julia> A = LinearAlgebra.Hermitian(rand(ComplexF64, (3, 3)))
3×3 LinearAlgebra.Hermitian{ComplexF64, Matrix{ComplexF64}}:
0.468242+0.0im 0.296799+0.49972im 0.621408+0.148878im
0.296799-0.49972im 0.413853+0.0im 0.677598+0.875476im
0.621408-0.148878im 0.677598-0.875476im 0.59341+0.0im
julia> LinearAlgebra.dot(A, A)
4.6860981128420285 + 0.0im
julia> real(LinearAlgebra.dot(A, A))
4.6860981128420285
```

**Reply:** I know. This is not an error, and will change anyway in Julia 1.11.
**Reply:** I know because I saw your PR 😄 My point is that I personally would argue that changing from [...]

**Reply:** I really don't understand the problem. It works. It's just slightly uglier. And moreover it's just example code in a tutorial. Nothing depends on whether its return value is a `Float64` or a `ComplexF64`.

**Reply:** It doesn't work. One returns a `ComplexF64`, the other a `Float64`.

**Reply:** Why is this a problem?
    else
        return "Something went wrong: $(raw_status(model))"
    end
**Comment:** What I wanted to point out here is that with MATLAB you have to be paranoid about making your matrices exactly Hermitian in order to not get subtle errors, whereas with JuMP you just need to wrap them and you're good to go.

But now I've tested it, and it turns out that it's not true. If you try to constrain a non-Hermitian matrix to be in `HermitianPSDCone()` you get an error message, as I expected, but if you try to constrain a non-symmetric matrix to be in `PSDCone()`, JuMP will let you (and fail afterwards, of course). Is that intentional?
**Reply:** For non-symmetric matrices, JuMP adds equality constraints for the off-diagonal elements. It lets you write [...]. We don't have the same for Hermitian because there is no `HermitianPositiveSemidefiniteConeSquare` (I'm sure we've discussed this, but I can't find the link. Edit: found it: #3369 (comment)).
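A minimal sketch of the behavior described above (the model and variable names are illustrative):

```julia
using JuMP

model = Model()
@variable(model, x)
@variable(model, y)
@variable(model, z)

# The matrix below is square but not structurally symmetric. JuMP accepts it
# in PSDCone() and enforces symmetry by equating the off-diagonal entries,
# which here adds the constraint y == 1.
@constraint(model, [x y; 1 z] in PSDCone())
```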
**Reply:** Why? Is there any case where you would need that? As far as I can see this can either generate less efficient code or give you an error.
**Reply:** Regardless of the need, it exists in JuMP 1.0 so we are not changing it. Someone could also write `[x y; 1 z] in PSDCone()` and we would add a constraint that `y == 1`. It may have been sensible to support only the triangular form of PSD, but we didn't do that, so it's no longer up for discussion. We added Hermitian support after 1.0, which is why it supports only the triangular form.
**Reply:** I was just curious; I wasn't suggesting changing this, as it would clearly be a breaking change. Although now that you brought it up, I will definitely suggest removing it when the time comes for JuMP 2.0.
**Reply:** My goal is not to release JuMP 2.0 if at all possible. If we did, it would be for a minimal set of breaking changes, not just to remove some things we don't like. The only candidate was the nonlinear changes, but we seem to have managed to sneak them in with minimal disruption. The JuMP developers have no plans for JuMP 2.0.