TY - JOUR
T1 - Convergence Analysis of a Locally Accelerated Preconditioned Steepest Descent Method for Hermitian-Definite Generalized Eigenvalue Problems
AU - Cai, Yunfeng
AU - Bai, Zhaojun
AU - Pask, John E.
AU - Sukumar, N.
JO - Journal of Computational Mathematics
VL - 36
IS - 5
SP - 739
EP - 760
PY - 2018
DA - 2018/06
DO - https://doi.org/10.4208/jcm.1703-m2016-0580
UR - https://global-sci.org/intro/article_detail/jcm/12455.html
KW - Eigenvalue problem, Steepest descent method, Preconditioning, Superlinear convergence
AB -
By extending the classical analysis techniques due to Samokish, Faddeev and Faddeeva, and Longsine and McCormick, among others, we prove the convergence of the preconditioned steepest descent with implicit deflation (PSD-id) method for solving Hermitian-definite generalized eigenvalue problems. Furthermore, we derive a nonasymptotic estimate of the rate of convergence of the PSD-id method. We show that, with a proper choice of the shift, the indefinite shift-and-invert preconditioner is a locally accelerated preconditioner and is asymptotically optimal, leading to superlinear convergence. Numerical examples are presented to verify the theoretical results on the convergence behavior of the PSD-id method for solving ill-conditioned Hermitian-definite generalized eigenvalue problems arising from electronic structure calculations. While rigorous and full-scale convergence proofs of the preconditioned block steepest descent methods in practical use still largely elude us, we believe the theoretical results presented in this paper shed light on an improved understanding of the convergence behavior of these block methods.