Volume 35, Issue 4
Linearly Convergent First-Order Algorithms for Semidefinite Programming
10.4208/jcm.1612-m2016-0703

J. Comp. Math., 35 (2017), pp. 452-468.

• Abstract

In this paper, we consider two different formulations (one smooth and the other nonsmooth) for solving linear matrix inequalities (LMIs), an important class of problems in semidefinite programming (SDP), under a certain Slater constraint qualification assumption. We then propose two first-order methods, one based on the subgradient method and the other on Nesterov's optimal method, and show that they converge linearly when applied to these formulations. Moreover, we introduce an accelerated prox-level method that converges linearly, uniformly over both the smooth and the nonsmooth problems, without requiring any problem parameters as input. Finally, we consider a special case of LMIs, namely linear systems of inequalities, and show that a linearly convergent algorithm can be obtained under a much weaker assumption.
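To illustrate the flavor of the special case mentioned at the end of the abstract, the sketch below applies a standard subgradient method with the Polyak step size to a linear system of inequalities $Ax \le b$. This is a generic illustration under a feasibility (Slater-type) assumption, not the paper's exact algorithm; the function name and tolerances are ours.

```python
import numpy as np

def subgradient_feasibility(A, b, x0, max_iter=500, tol=1e-10):
    """Find x with A @ x <= b by minimizing the worst residual
    f(x) = max(0, max_i (a_i^T x - b_i)) via subgradient steps.
    Polyak's step size uses the known optimal value f* = 0."""
    x = x0.astype(float)
    for _ in range(max_iter):
        r = A @ x - b
        i = int(np.argmax(r))          # most violated inequality
        f = max(0.0, r[i])
        if f <= tol:                   # feasible (up to tolerance)
            break
        g = A[i]                       # a subgradient of f at x
        x = x - (f / (g @ g)) * g      # Polyak step toward feasibility
    return x

# Usage: a small feasible system x1 + x2 <= 1, x1 >= 0, x2 >= 0.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
x = subgradient_feasibility(A, b, np.array([5.0, 5.0]))
```

Under a strict feasibility assumption of this kind, such residual-minimizing subgradient schemes are classically known to converge linearly, which is the regime the paper analyzes in far greater generality for LMIs.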

• History

Published online: 2017-08

• Keywords