Global convergence result for conjugate gradient methods

Volume: 71, Issue: 2, Pages: 399 - 405
Published: Nov 1, 1991
Abstract
Conjugate gradient optimization algorithms depend on the search directions
$$s^{(1)} = -g^{(1)}, \qquad s^{(k+1)} = -g^{(k+1)} + \beta^{(k)} s^{(k)}, \quad k \geq 1,$$
with different methods arising from different choices for the scalar $\beta^{(k)}$. In this note, conditions are given on $\beta^{(k)}$ to ensure global convergence of the resulting...
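To make the recurrence above concrete, the sketch below is a minimal nonlinear conjugate gradient loop, not the method analyzed in the paper: it uses the classical Fletcher-Reeves choice $\beta^{(k)} = \|g^{(k+1)}\|^2 / \|g^{(k)}\|^2$ as one example of the scalar $\beta^{(k)}$, and the quadratic objective, Armijo backtracking line search, and tolerances are illustrative assumptions.

```python
# Minimal sketch of the CG direction recurrence s^(k+1) = -g^(k+1) + beta^(k) s^(k).
# The Fletcher-Reeves beta, the line search, and the test problem are assumptions
# for illustration; they are not taken from the paper.
import numpy as np

def conjugate_gradient(f, grad, x0, max_iter=200, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                   # s^(1) = -g^(1)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(s) >= 0:                    # safeguard: restart with steepest descent
            s = -g
        # Armijo backtracking line search along s (an illustrative choice).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * s) > f(x) + c * alpha * g.dot(s):
            alpha *= rho
        x_new = x + alpha * s
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta^(k)
        s = -g_new + beta * s                # s^(k+1) = -g^(k+1) + beta^(k) s^(k)
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
    grad = lambda x: A.dot(x) - b
    print(conjugate_gradient(f, grad, np.zeros(2)))
```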