A rapidly convergence algorithm for linear search and its application
Numer. Math. J. Chinese Univ. (English Ser.) 15 (2006), pp. 299-305
Published online: 2006-11
@Article{NM-15-299,
author = {J. Li and H. Zhu and X. Zhou and W. Song},
title = {A rapidly convergence algorithm for linear search and its application},
journal = {Numerical Mathematics, a Journal of Chinese Universities},
year = {2006},
volume = {15},
number = {4},
pages = {299--305},
abstract = {
The essence of the linear search is a one-dimensional nonlinear minimization
problem, which is an important part of multidimensional nonlinear optimization
and accounts for most of the operation count in solving an optimization
problem. To improve efficiency, we start from quadratic interpolation, combine
it with the quadratic convergence rate of Newton's method, and adopt the idea
of Anderson-Björck extrapolation to present a rapidly convergent algorithm
together with the corresponding convergence conclusions. Finally, we carried
out numerical experiments on some well-known optimization test functions and an
application test on ANN learning examples. The experimental results show the
validity of the algorithm.
},
url = {http://global-sci.org/intro/article_detail/nm/8037.html}
}
TY - JOUR
T1 - A rapidly convergence algorithm for linear search and its application
AU - Li, J.
AU - Zhu, H.
AU - Zhou, X.
AU - Song, W.
JO - Numerical Mathematics, a Journal of Chinese Universities
VL - 15
IS - 4
SP - 299
EP - 305
PY - 2006
DA - 2006/11
UR - https://global-sci.org/intro/article_detail/nm/8037.html
AB -
The essence of the linear search is a one-dimensional nonlinear minimization
problem, which is an important part of multidimensional nonlinear optimization
and accounts for most of the operation count in solving an optimization
problem. To improve efficiency, we start from quadratic interpolation, combine
it with the quadratic convergence rate of Newton's method, and adopt the idea
of Anderson-Björck extrapolation to present a rapidly convergent algorithm
together with the corresponding convergence conclusions. Finally, we carried
out numerical experiments on some well-known optimization test functions and an
application test on ANN learning examples. The experimental results show the
validity of the algorithm.
J. Li, H. Zhu, X. Zhou and W. Song. (2006). A rapidly convergence algorithm for linear search and its application. Numerical Mathematics, a Journal of Chinese Universities. 15 (4). 299-305.
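The abstract starts its construction from quadratic interpolation. The paper's full scheme (with Newton-style acceleration and Anderson-Björck extrapolation) is not reproduced here; as a point of reference only, a standard successive-quadratic-interpolation line search — the baseline the paper builds on — can be sketched as below. The function name and bracket-update logic are illustrative assumptions, not the authors' algorithm.

```python
def quad_interp_line_search(f, a, b, c, tol=1e-8, max_iter=100):
    """Minimize f on a bracket a < b < c with f(b) <= min(f(a), f(c)),
    by repeatedly jumping to the vertex of the interpolating parabola."""
    fa, fb, fc = f(a), f(b), f(c)
    for _ in range(max_iter):
        # Vertex of the parabola through (a, fa), (b, fb), (c, fc).
        num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        if den == 0:  # three collinear points: no proper parabola
            break
        x = b - 0.5 * num / den
        if abs(x - b) < tol:  # vertex coincides with current best point
            break
        fx = f(x)
        # Shrink the bracket, keeping the best point in the middle.
        if fx < fb:
            if x < b:
                c, fc = b, fb
            else:
                a, fa = b, fb
            b, fb = x, fx
        else:
            if x < b:
                a, fa = x, fx
            else:
                c, fc = x, fx
    return b, fb
```

For an exactly quadratic objective such as `f(t) = (t - 2)**2 + 1`, a single interpolation step lands on the true minimizer `t = 2`; for general one-dimensional functions the iteration converges superlinearly near a smooth minimum, which is the property faster variants like the paper's aim to improve on.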