A new spectral conjugate gradient method with descent condition and global convergence property for unconstrained optimization
Abstract
The spectral conjugate gradient method is an efficient method for solving large-scale unconstrained optimization problems. In this paper, we propose a new spectral conjugate gradient method and analyze its performance numerically. We establish the descent condition and the global convergence property under some assumptions and the strong Wolfe line search. Numerical experiments evaluating the method's efficiency are conducted on 98 problems...
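The paper's specific spectral and conjugate parameters are not reproduced in this abstract, so the following is only a generic sketch of the spectral conjugate gradient iteration d_k = -θ_k g_k + β_k d_{k-1}. It uses a Barzilai–Borwein-style spectral parameter θ_k and a Fletcher–Reeves β_k as placeholder choices, with a simple backtracking (Armijo) line search standing in for a full strong Wolfe search; all names and the test problem are illustrative assumptions, not the authors' method.

```python
# Generic spectral conjugate gradient (SCG) sketch on a 2-D convex quadratic.
# Placeholder parameter choices (BB-style theta, Fletcher-Reeves beta); the
# paper's own formulas and strong Wolfe line search are not reproduced here.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):  # returns a*x + y, elementwise
    return [a * xi + yi for xi, yi in zip(x, y)]

A = [[3.0, 1.0], [1.0, 2.0]]  # test problem: f(x) = 0.5 x^T A x - b^T x
b = [1.0, 1.0]                # minimizer solves A x = b, i.e. x* = (0.2, 0.4)

def f(x):
    Ax = [dot(row, x) for row in A]
    return 0.5 * dot(x, Ax) - dot(b, x)

def grad(x):
    return [dot(row, x) - bi for row, bi in zip(A, b)]

def armijo(x, d, g, c1=1e-4):
    """Backtracking line search enforcing the Armijo sufficient-decrease
    condition (a simplification of the strong Wolfe search)."""
    alpha, fx, gd = 1.0, f(x), dot(g, d)
    for _ in range(60):
        if f(axpy(alpha, d, x)) <= fx + c1 * alpha * gd:
            break
        alpha *= 0.5
    return alpha

def scg(x, tol=1e-8, max_iter=200):
    g = grad(x)
    d = [-gi for gi in g]  # initial direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        alpha = armijo(x, d, g)
        x_new = axpy(alpha, d, x)
        g_new = grad(x_new)
        s = [a - c for a, c in zip(x_new, x)]  # step s_k
        y = [a - c for a, c in zip(g_new, g)]  # gradient change y_k
        theta = dot(s, s) / dot(s, y)          # BB-style spectral parameter
        beta = dot(g_new, g_new) / dot(g, g)   # Fletcher-Reeves beta
        d = axpy(beta, d, [-theta * gi for gi in g_new])  # d = -theta*g + beta*d
        if dot(g_new, d) >= 0:                 # safeguard: restart if not descent
            d = [-theta * gi for gi in g_new]
        x, g = x_new, g_new
    return x

x_star = scg([0.0, 0.0])
```

The descent safeguard above illustrates why papers in this line of work prove a descent condition for their specific θ_k and β_k: without it, the combined direction can fail to be a descent direction and the line search stalls.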
Paper Details
Published Date
Aug 13, 2020
Journal
Volume
10
Issue
5
Pages
2053–2069