Acceleration of conjugate gradient algorithms for unconstrained optimization

Volume: 213, Issue: 2, Pages: 361 - 369
Published: Jul 1, 2009
Abstract
Conjugate gradient methods are important for solving large-scale unconstrained optimization problems. This paper proposes an acceleration of these methods based on a modification of the steplength. The idea is to modify, in a multiplicative manner, the steplength αk computed by the Wolfe line search conditions by means of a positive parameter ηk, in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex...
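The scheme described in the abstract computes the Wolfe steplength αk as usual and then rescales it by a positive factor ηk before taking the step. The following is a minimal Python sketch of an iteration of this flavor; the particular formula for ηk (derived from a one-dimensional quadratic model along the search direction, using a trial gradient at xk + αk·dk), the PR+ conjugacy parameter, and the helper name accelerated_cg are illustrative assumptions, not the paper's exact algorithm or notation.

```python
# Hedged sketch of an "accelerated" conjugate gradient iteration: the Wolfe
# steplength alpha_k is rescaled by a positive factor eta_k before the update.
# The eta_k below comes from a quadratic model of f along d_k and is only one
# plausible choice; it is not claimed to be the paper's exact rule.
import numpy as np
from scipy.optimize import line_search


def accelerated_cg(f, grad, x0, max_iter=200, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Steplength alpha_k satisfying the (strong) Wolfe conditions.
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                     # line search failed: restart
            d, alpha = -g, 1e-4
        z = x + alpha * d                     # trial point used to build eta_k
        gz = grad(z)
        # Multiplicative correction eta_k (illustrative quadratic-model choice).
        a = alpha * g.dot(d)                  # alpha_k * g_k^T d_k (negative for a descent d_k)
        b = -alpha * (gz - g).dot(d)          # curvature-like term along d_k
        eta = -a / b if b > 0 else 1.0        # fall back to eta_k = 1 (plain CG step)
        x_new = x + eta * alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere conjugacy parameter with nonnegativity safeguard (PR+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Rosenbrock test function as a quick check of the sketch.
    from scipy.optimize import rosen, rosen_der
    print(accelerated_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```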