Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization

Volume 230, Issue 2, Pages 570–582
Published: Aug 1, 2009
Abstract
In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter βk is computed by an approximation of the Hessian/vector product through finite differences. For search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an...
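The core device described in the abstract — approximating the Hessian/vector product by a forward difference of gradients — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm; the step-size formula for the finite difference interval is a common heuristic and is an assumption here, as are all function and variable names.

```python
import numpy as np

def hessian_vector_fd(grad, x, v, eps=None):
    """Forward-difference approximation of the Hessian/vector product H(x) @ v.

    grad : callable returning the gradient of f at a point.
    eps  : finite difference interval; if None, a standard heuristic is used
           (a hypothetical choice, not necessarily the one from the paper).
    """
    if eps is None:
        # Balance truncation error (wants small eps) against round-off
        # error in the gradient difference (wants large eps).
        eps = (2.0 * np.sqrt(np.finfo(float).eps)
               * (1.0 + np.linalg.norm(x)) / max(np.linalg.norm(v), 1e-16))
    # H(x) v ~ (grad(x + eps*v) - grad(x)) / eps
    return (grad(x + eps * v) - grad(x)) / eps

# Usage on a quadratic f(x) = 0.5 x^T A x, whose Hessian is exactly A,
# so the forward difference recovers A @ v up to round-off.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, 2.0])
v = np.array([0.5, -1.0])
hv = hessian_vector_fd(grad, x, v)
```

The appeal of this scheme, which the abstract alludes to, is that it needs only gradient evaluations: one extra gradient call per product, with no explicit Hessian ever formed or stored.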