TY - JOUR
T1 - A Note on the Nonlinear Conjugate Gradient Method
AU - Dai, Yu-Hong
AU - Yuan, Ya-Xiang
JO - Journal of Computational Mathematics
VL - 20
IS - 6
SP - 575
EP - 582
PY - 2002
DA - 2002/12
DO - http://doi.org/
UR - https://global-sci.org/intro/article_detail/jcm/8942.html
KW - Unconstrained optimization, Conjugate gradient, Line search, Global convergence
AB -

Nonlinear conjugate gradient methods for unconstrained optimization differ only in the choice of a scalar parameter. In this note, a general condition on this scalar is given which ensures the global convergence of the method under strong Wolfe line searches. It is also discussed how the result can be used to establish the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods. It is further noted that, in a certain sense, the condition cannot be relaxed.
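
The abstract refers to the scalar that distinguishes one nonlinear conjugate gradient method from another (the parameter commonly written beta_k) and to strong Wolfe line searches. The following Python sketch is an illustration only, not the authors' code or the result of the note: it shows where that scalar enters the iteration, with the Fletcher-Reeves and Polak-Ribiere-Polyak choices as examples. The function name nonlinear_cg and the Rosenbrock test problem are assumptions made for this example; the strong Wolfe step is delegated to scipy.optimize.line_search.

    # Minimal sketch of nonlinear CG, assuming a Rosenbrock-style test problem.
    # The strong Wolfe line search comes from scipy.optimize.line_search.
    import numpy as np
    from scipy.optimize import line_search

    def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=200):
        """x_{k+1} = x_k + alpha_k d_k,  d_k = -g_k + beta_k d_{k-1}."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                      # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Strong Wolfe line search (scipy defaults c1=1e-4, c2=0.9).
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:       # search failed: restart along -g with a small step
                d, alpha = -g, 1e-4
            x_new = x + alpha * d
            g_new = grad(x_new)
            if beta_rule == "FR":   # Fletcher-Reeves scalar
                beta = (g_new @ g_new) / (g @ g)
            else:                   # Polak-Ribiere-Polyak scalar
                beta = g_new @ (g_new - g) / (g @ g)
            d = -g_new + beta * d   # the scalar beta_k is the only difference
            x, g = x_new, g_new
        return x

    if __name__ == "__main__":
        # Rosenbrock function as a stand-in test problem (an assumption).
        f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
        grad = lambda x: np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])
        print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), beta_rule="PRP"))

Switching beta_rule between "FR" and "PRP" changes only the scalar beta_k; that scalar is precisely the quantity constrained by the convergence condition discussed in the note.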