Volume 20, Issue 6
A Note on the Nonlinear Conjugate Gradient Method

Yu Hong Dai & Ya Xiang Yuan

DOI:

J. Comp. Math., 20 (2002), pp. 575-582

Published online: 2002-12

  • Abstract

The conjugate gradient method for unconstrained optimization problems varies with a scalar. In this note, a general condition concerning the scalar is given, which ensures the global convergence of the method in the case of strong Wolfe line searches. It is also discussed how to use the result to obtain the convergence of the famous Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods. That the condition cannot be relaxed in some sense is mentioned.
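For context, the iteration the abstract refers to is $x_{k+1} = x_k + \alpha_k d_k$ with $d_k = -g_k + \beta_k d_{k-1}$, where the scalar $\beta_k$ distinguishes the variants; the Fletcher-Reeves choice is $\beta_k = \|g_k\|^2 / \|g_{k-1}\|^2$. The sketch below is not the paper's algorithm: for brevity it minimizes a 2-D quadratic with an exact line search (which satisfies the strong Wolfe conditions on a quadratic) rather than a general strong Wolfe search.

```python
# Minimal sketch of nonlinear conjugate gradient with the Fletcher-Reeves
# scalar beta_k, applied to the quadratic f(x) = 0.5 x^T A x - b^T x.
# Illustration only; the paper treats general objectives and strong Wolfe
# line searches, which this exact line search replaces for simplicity.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def cg_fletcher_reeves(A, b, x0, tol=1e-10, max_iter=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A.

    Direction update: d_k = -g_k + beta_k * d_{k-1},
    with beta_k = ||g_k||^2 / ||g_{k-1}||^2 (Fletcher-Reeves).
    """
    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient A x - b
    d = [-gi for gi in g]
    for _ in range(max_iter):
        if dot(g, g) < tol:
            break
        # Exact line search for a quadratic: alpha = -g^T d / (d^T A d).
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        beta = dot(g_new, g_new) / dot(g, g)  # Fletcher-Reeves scalar
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x
```

For example, with `A = [[4, 1], [1, 3]]` and `b = [1, 2]`, two iterations from the origin reach the minimizer `(1/11, 7/11)`, as expected for CG on a 2-D quadratic.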

  • Keywords

Unconstrained optimization, Conjugate gradient, Line search, Global convergence

  • AMS Subject Headings

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
  • RIS
  • TXT
@Article{JCM-20-575,
  author = {Dai, Yu Hong and Yuan, Ya Xiang},
  title = {A Note on the Nonlinear Conjugate Gradient Method},
  journal = {Journal of Computational Mathematics},
  year = {2002},
  volume = {20},
  number = {6},
  pages = {575--582},
  abstract = {The conjugate gradient method for unconstrained optimization problems varies with a scalar. In this note, a general condition concerning the scalar is given, which ensures the global convergence of the method in the case of strong Wolfe line searches. It is also discussed how to use the result to obtain the convergence of the famous Fletcher-Reeves and Polak-Ribi\`{e}re-Polyak conjugate gradient methods. That the condition cannot be relaxed in some sense is mentioned.},
  issn = {1991-7139},
  doi = {https://doi.org/},
  url = {http://global-sci.org/intro/article_detail/jcm/8942.html}
}
TY  - JOUR
T1  - A Note on the Nonlinear Conjugate Gradient Method
AU  - Dai, Yu Hong
AU  - Yuan, Ya Xiang
JO  - Journal of Computational Mathematics
VL  - 20
IS  - 6
SP  - 575
EP  - 582
PY  - 2002
DA  - 2002/12
SN  - 1991-7139
DO  - http://doi.org/
UR  - https://global-sci.org/intro/article_detail/jcm/8942.html
KW  - Unconstrained optimization
KW  - Conjugate gradient
KW  - Line search
KW  - Global convergence
AB  - The conjugate gradient method for unconstrained optimization problems varies with a scalar. In this note, a general condition concerning the scalar is given, which ensures the global convergence of the method in the case of strong Wolfe line searches. It is also discussed how to use the result to obtain the convergence of the famous Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods. That the condition cannot be relaxed in some sense is mentioned.
Yu Hong Dai & Ya Xiang Yuan. (2002). A Note on the Nonlinear Conjugate Gradient Method. Journal of Computational Mathematics. 20 (6). 575-582. doi: