J. Comp. Math., 42 (2024), pp. 1605-1626.
Published online: 2024-11
The alternating direction method of multipliers (ADMM) has been extensively investigated over the past decades for solving separable convex optimization problems, and, perhaps surprisingly, it also performs efficiently on nonconvex programs. In this paper, we propose a symmetric ADMM based on acceleration techniques for a family of potentially nonsmooth and nonconvex programming problems with equality constraints, in which the dual variables are updated twice with different stepsizes. Under suitable assumptions that do not involve the so-called Kurdyka-Łojasiewicz inequality, we analyze the convergence of the proposed algorithm and its pointwise iteration complexity in terms of the corresponding augmented Lagrangian function and the primal-dual residuals, respectively. The performance of our algorithm is verified by numerical examples from signal processing applications involving sparse nonconvex/convex regularized minimization.
DOI: https://doi.org/10.4208/jcm.2305-m2021-0107
URL: http://global-sci.org/intro/article_detail/jcm/23509.html
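For orientation, here is a minimal sketch of the classical symmetric ADMM template for the two-block model $\min_{x,y} f(x) + g(y)$ subject to $Ax + By = c$, in which the multiplier $\lambda$ is updated twice per iteration with stepsizes $\tau_1$ and $\tau_2$. This is the standard scheme only; the accelerated variant analyzed in the paper may differ in its subproblems and stepsize conditions.

```latex
% Augmented Lagrangian:
%   L_beta(x, y, lambda) = f(x) + g(y) + <lambda, Ax + By - c>
%                          + (beta/2) * ||Ax + By - c||^2
\begin{align*}
x^{k+1} &\in \arg\min_{x} \; \mathcal{L}_{\beta}\bigl(x, y^{k}, \lambda^{k}\bigr),\\
\lambda^{k+\frac{1}{2}} &= \lambda^{k} + \tau_{1}\beta \bigl(A x^{k+1} + B y^{k} - c\bigr),\\
y^{k+1} &\in \arg\min_{y} \; \mathcal{L}_{\beta}\bigl(x^{k+1}, y, \lambda^{k+\frac{1}{2}}\bigr),\\
\lambda^{k+1} &= \lambda^{k+\frac{1}{2}} + \tau_{2}\beta \bigl(A x^{k+1} + B y^{k+1} - c\bigr).
\end{align*}
```

As a concrete, purely illustrative instance in the spirit of the paper's sparse-regularization experiments, the NumPy sketch below applies this template to the convex $\ell_1$-regularized least-squares model $\min \tfrac{1}{2}\|Dx - b\|^2 + \mu\|y\|_1$ subject to $x - y = 0$. The function names and parameter values (`mu`, `beta`, `tau1`, `tau2`) are assumptions chosen for the example, not the paper's settings.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def symmetric_admm_l1(D, b, mu, beta=1.0, tau1=0.9, tau2=0.9, iters=500):
    # Symmetric ADMM sketch for: min 0.5*||D x - b||^2 + mu*||y||_1  s.t.  x - y = 0.
    # tau1/tau2 are the two dual stepsizes; the defaults here are
    # illustrative, not the paper's stepsize conditions.
    n = D.shape[1]
    x = np.zeros(n); y = np.zeros(n); lam = np.zeros(n)
    G = D.T @ D + beta * np.eye(n)   # x-subproblem normal-equations matrix
    Dtb = D.T @ b
    for _ in range(iters):
        # x-step: exact minimization of the augmented Lagrangian in x.
        x = np.linalg.solve(G, Dtb - lam + beta * y)
        # First dual update (stepsize tau1), using the new x and the old y.
        lam = lam + tau1 * beta * (x - y)
        # y-step: proximal map of the l1 term.
        y = soft_threshold(x + lam / beta, mu / beta)
        # Second dual update (stepsize tau2).
        lam = lam + tau2 * beta * (x - y)
    return x

# Toy sparse-recovery instance.
rng = np.random.default_rng(0)
D = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, size=8, replace=False)] = rng.standard_normal(8)
b = D @ x_true + 0.01 * rng.standard_normal(80)
x_hat = symmetric_admm_l1(D, b, mu=0.1)
```

Caching the matrix `G` outside the loop makes each x-step a single triangular-style solve against a fixed system, which is the usual design choice when the linear operator does not change across iterations.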