A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
Abstract
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes--Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition $\mathbf{g}_k^{\mathsf{T}} \mathbf{d}_k \le -\frac{7}{8} \|\mathbf{g}_k\|^2$. Moreover, a global convergence result is established when...
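The descent condition quoted in the abstract can be illustrated numerically. The sketch below is an assumption-laden toy, not the paper's implementation: it uses the Hager--Zhang direction update with a plain Armijo backtracking line search standing in for the paper's efficient line search, and checks the stated bound $\mathbf{g}_k^{\mathsf{T}} \mathbf{d}_k \le -\frac{7}{8} \|\mathbf{g}_k\|^2$ at each step.

```python
import numpy as np

def hager_zhang_beta(g_new, g_old, d):
    """Conjugacy parameter beta_k = (y_k - 2 d_k ||y_k||^2 / (d_k^T y_k))^T
    g_{k+1} / (d_k^T y_k), with y_k = g_{k+1} - g_k."""
    y = g_new - g_old
    dy = d @ y
    if abs(dy) < 1e-16:          # degenerate step: restart with steepest descent
        return 0.0
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def cg_sketch(f, grad, x0, tol=1e-8, max_iter=500):
    """Toy nonlinear CG loop; the backtracking search here is only a
    stand-in for the paper's Wolfe-based line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (assumed, not from the paper)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        d = -g_new + hager_zhang_beta(g_new, g, d) * d
        g = g_new
        # the descent condition stated in the abstract
        assert g @ d <= -0.875 * (g @ g) + 1e-10
    return x
```

On a small strictly convex quadratic, the loop drives the gradient to zero while the assertion verifies that every computed direction satisfies the claimed descent bound, independent of the line search accuracy.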
Paper Details
Title: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
Published Date: Jan 1, 2005
Journal:
Volume: 16
Issue: 1
Pages: 170–192