<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
<meta name="generator" content="jemdoc, see http://jemdoc.jaboc.net/" />
<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
<link rel="stylesheet" href="jemdoc.css" type="text/css" />
<title>Research</title>
</head>
<body>
<table summary="Table for page layout." id="tlayout">
<tr valign="top">
<td id="layout-menu">
<div class="menu-category">Siqi Zhang</div>
<div class="menu-item"><a href="index.html">Home</a></div>
<div class="menu-item"><a href="bio.html">Biography</a></div>
<div class="menu-item"><a href="publication.html">Research</a></div>
<div class="menu-item"><a href="teaching.html">Teaching</a></div>
<div class="menu-category">Misc</div>
<div class="menu-item"><a href="links.html">Useful Links</a></div>
</td>
<td id="layout-content">
<div id="toptitle">
<h1>Research</h1>
<div id="subtitle"></div>
</div>
<h2>Publications (<a href="https://scholar.google.com/citations?user=0M171lEAAAAJ&amp;hl=en" target="_blank">Google Scholar</a>)</h2>
<p></p>
<dl>
<dt>Generalization Bounds of Nonconvex-(Strongly)-Concave Stochastic Minimax Optimization</dt>
<dd><p>
<b>Siqi Zhang*</b>, Yifan Hu*, Liang Zhang, Niao He. <br />
<i>International Conference on Artificial Intelligence and Statistics (AISTATS) 2024</i>
<a href="https://arxiv.org/abs/2205.14278" target=“blank”> [arXiv]</a> <a href="https://opt-ml.org/papers/2022/paper70.pdf" target=“blank”> [Workshop]</a>
<br />
</p></dd>
<dt>Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates</dt>
<dd><p>
<b>Siqi Zhang*</b>, Sayantan Choudhury*, Sebastian U Stich, Nicolas Loizou. <br />
<i>International Conference on Learning Representations (ICLR) 2024</i>
<a href="https://arxiv.org/abs/2306.05100" target=“blank”> [arXiv]</a> <a href="https://opt-ml.org/papers/2022/paper84.pdf" target=“blank”> [Workshop]</a>
<br />
</p></dd>
<dt>The Complexity of Nonconvex-Strongly-Concave Minimax Optimization</dt>
<dd><p>
<b>Siqi Zhang*</b>, Junchi Yang*, Cristobal Guzman, Negar Kiyavash, Niao He. <br />
<i>Uncertainty in Artificial Intelligence (UAI) 2021</i>
<a href="https://arxiv.org/abs/2103.15888" target=“blank”> [arXiv]</a> <a href="https://proceedings.mlr.press/v161/zhang21c.html" target=“blank”> [UAI]</a>
<br />
</p></dd>
<dt>Biased Stochastic First-Order Methods for Conditional Stochastic Optimization and Applications in Meta Learning</dt>
<dd><p>
Yifan Hu*, <b>Siqi Zhang*</b>, Xin Chen, Niao He. <br />
<i>Neural Information Processing Systems (NeurIPS) 2020</i>
<a href="https://arxiv.org/abs/2002.10790" target=“blank”> [arXiv]</a> <a href="https://proceedings.neurips.cc/paper/2020/hash/1cdf14d1e3699d61d237cf76ce1c2dca-Abstract.html" target=“blank”> [NeurIPS]</a>
<br />
</p></dd>
<dt>A Catalyst Framework for Minimax Optimization</dt>
<dd><p>
Junchi Yang, <b>Siqi Zhang</b>, Negar Kiyavash, Niao He. <br />
<i>Neural Information Processing Systems (NeurIPS) 2020</i>
<a href="https://proceedings.neurips.cc/paper/2020/hash/3db54f5573cd617a0112d35dd1e6b1ef-Abstract.html" target=“blank”> [NeurIPS]</a>
<br /></p></dd>
<dt>(* denotes equal contributions)</dt>
<dd><p>
</p></dd>
</dl>
<h2>Preprints</h2>
<p></p>
<dl>
<dt>First-Order Optimization Inspired from Finite-Time Convergent Flows</dt>
<dd><p>
<b>Siqi Zhang</b>, Mouhacine Benosman, Orlando Romero, Anoop Cherian. 2021
<a href="https://arxiv.org/abs/2010.02990" target=“blank”> [arXiv]</a>
</p></dd>
<dt>On the Convergence Rate of Stochastic Mirror Descent for Nonsmooth Nonconvex Optimization</dt>
<dd><p>
<b>Siqi Zhang</b>, Niao He. 2018
<a href="https://arxiv.org/abs/1806.04781" target=“blank”> [arXiv]</a>
</p></dd>
</dl>
<h2>Selected Talks</h2>
<dl>
<dt><i>Generalization in Nonconvex Minimax Optimization: A Comprehensive Study</i></dt>
<dd><p>
INFORMS Optimization Society Conference, Houston, USA, 2024.03
<br />
CAMDA Conference, College Station, USA, 2023.05
</p></dd>
<dt><i>Communication-Efficient Federated Learning Algorithms for Variational Inequalities</i></dt>
<dd><p>
SIAM Conference on Optimization (OP23), Seattle, USA, 2023.06
<br />
Annual Conference on Information Sciences and Systems (CISS 2023), Baltimore, USA, 2023.03
</p></dd>
<dt><i>Nonconvex Minimax Optimization: Fundamental Limits, Efficient Algorithms and Generalization</i></dt>
<dd><p>
INFORMS Annual Meeting, Indianapolis, USA, 2022.10
<br />
MINDS &amp; CIS Seminar, JHU, Baltimore, USA, 2022.10
<br />
International Conference on Continuous Optimization (ICCOPT), Bethlehem, USA, 2022.07
</p></dd>
</dl>
<div id="footer">
<div id="footer-text">
Page generated 2024-11-28, by <a href="https://github.com/wsshin/jemdoc_mathjax" target="_blank">jemdoc+MathJax</a>.
</div>
</div>
</td>
</tr>
</table>
</body>
</html>