New modification of the Hestenes-Stiefel with strong Wolfe line search

Published on May 5, 2021 · DOI: 10.1063/5.0053211
Nur Athira Japri, Srimazzura Basri (Estimated H-index: 2), Mustafa Mamat (Estimated H-index: 15)
Abstract
The nonlinear conjugate gradient method is widely used in solving large-scale unconstrained optimization, since it has been proven to solve such problems without requiring large memory storage. In this paper, we propose a new modification of the Hestenes-Stiefel conjugate gradient parameter that fulfils the sufficient descent condition under the strong Wolfe-Powell line search. The conjugate gradient method with the proposed parameter also requires fewer iterations and less CPU time than the classical conjugate gradient parameters. Numerical results show that the method with the proposed parameter performs better than the methods with the classical parameters.
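The abstract does not reproduce the modified Hestenes-Stiefel parameter itself, so the sketch below (Python, assuming NumPy and SciPy) uses the classical Hestenes-Stiefel coefficient together with SciPy's strong Wolfe line search, purely to illustrate the conjugate gradient loop such modifications plug into; the function name cg_hs, the restart rules, and the tolerances are illustrative choices, not the paper's method.

```python
# Minimal sketch: nonlinear CG with the classical Hestenes-Stiefel (HS)
# coefficient and a strong Wolfe line search. The modified HS parameter
# proposed in the paper is NOT reproduced here; it would replace `beta`.
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def cg_hs(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # classical HS
        d = -g_new + max(beta, 0.0) * d      # non-negative beta as a safeguard
        x, g = x_new, g_new
    return x

# Usage: minimize the 2-D Rosenbrock function from the standard start (-1.2, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_hs(f, grad, [-1.2, 1.0]))           # approaches (1, 1)
```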
References (16)
Srimazzura Basri (UniSZA: Universiti Sultan Zainal Abidin; H-index: 2), Mustafa Mamat (UTHM: Universiti Tun Hussein Onn Malaysia; H-index: 15)
Nonlinear conjugate gradient methods are widely used in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements. Numerous studies have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients that possess global convergence properties is proposed. The global convergence result using exact line searches is discussed. Numerical results show that the proposed method is more eff...
2 Citations
Mohamed Hamoda (H-index: 3), Mohd Rivaie (H-index: 7), …, Zabidin Salleh (H-index: 8) (4 authors in total)
5 Citations
Mustafa Mamat (H-index: 15), …, Osman Omer (H-index: 2) (5 authors in total)
Conjugate gradient methods are effective in solving linear equations and nonlinear optimization problems. In this work we compare our new conjugate gradient coefficient βk with the classical formulas under the strong Wolfe line search; our method satisfies the sufficient descent condition. Numerical results have shown that the new βk performs better than the classical formulas.
11 Citations
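The "classical formulas" this entry benchmarks against are not spelled out; for orientation, here is a minimal sketch of the coefficients usually meant by "classical" in this literature (Fletcher-Reeves, Polak-Ribiere-Polyak, Hestenes-Stiefel, Dai-Yuan). The function names are illustrative.

```python
# Classical conjugate gradient coefficients, written as functions of the
# current gradient g, the new gradient g_new, and the current direction d.
import numpy as np

def beta_fr(g, g_new, d):        # Fletcher-Reeves
    return (g_new @ g_new) / (g @ g)

def beta_prp(g, g_new, d):       # Polak-Ribiere-Polyak
    return (g_new @ (g_new - g)) / (g @ g)

def beta_hs(g, g_new, d):        # Hestenes-Stiefel
    y = g_new - g
    return (g_new @ y) / (d @ y)

def beta_dy(g, g_new, d):        # Dai-Yuan
    y = g_new - g
    return (g_new @ g_new) / (d @ y)
```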
Momin Jamil (BTH: Blekinge Institute of Technology; H-index: 6), Xin-She Yang (Middlesex University; H-index: 87)
Test functions are important to validate and compare the performance of optimisation algorithms. There have been many test or benchmark functions reported in the literature; however, there is no standard list or set of benchmark functions. Ideally, test functions should have diverse properties to be truly useful to test new algorithms in an unbiased way. For this purpose, we have reviewed and compiled a rich set of 175 benchmark functions for unconstrained optimisation problems with diverse prop...
590 Citations
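As a small illustration of the kind of entries such a collection contains, a sketch with two classic test functions follows; the dictionary layout and the sphere dimension and starting point are illustrative choices, not taken from the compiled set itself.

```python
# Two classic unconstrained test functions with conventional starting points.
import numpy as np

BENCHMARKS = {
    # name: (objective, initial point, known minimizer)
    "sphere": (lambda x: np.sum(x**2),
               np.full(10, 1.0), np.zeros(10)),
    "rosenbrock": (lambda x: np.sum(100*(x[1:] - x[:-1]**2)**2
                                    + (1 - x[:-1])**2),
                   np.array([-1.2, 1.0]), np.ones(2)),
}

for name, (f, x0, x_star) in BENCHMARKS.items():
    print(f"{name}: f(x0) = {f(x0):.4g}, f(x*) = {f(x_star):.4g}")
```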
Mohd Rivaie (H-index: 7), Mustafa Mamat (H-index: 15), …, Ismail Mohd (H-index: 8) (4 authors in total)
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization due to their low memory requirements and global convergence properties. Numerous studies and modifications have been devoted recently to improving this method. In this paper, a new modification of the conjugate gradient coefficient (βk) with global convergence properties is presented. The global convergence result is established using exact line searches. Preliminary result shows that the p...
4 Citations
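The convergence result above is established under exact line searches, i.e. alpha_k = argmin over alpha > 0 of f(x_k + alpha * d_k). A minimal numerical stand-in might look as follows; this is an approximation (the analysis treats the minimization exactly), and the bracket [0, 10] is an arbitrary illustrative bound.

```python
# Approximate an exact line search with a bounded 1-D scalar minimizer.
import numpy as np
from scipy.optimize import minimize_scalar

def exact_line_search(f, x, d):
    phi = lambda alpha: f(x + alpha * d)          # 1-D restriction of f
    res = minimize_scalar(phi, bounds=(0.0, 10.0), method="bounded")
    return res.x

# Usage: steepest-descent step on a simple quadratic; exact answer is 1.0
f = lambda x: 0.5 * (x @ x)
x0 = np.array([3.0, -4.0])
print(exact_line_search(f, x0, -x0))
```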
Neculai Andrei (H-index: 20)
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995); others are from More, Garbow and...
260 Citations
Elizabeth D. Dolan (Argonne National Laboratory; H-index: 6), Jorge J. Moré (Argonne National Laboratory; H-index: 50)
We propose performance profiles (distribution functions for a performance metric) as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
3,346 Citations
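A minimal sketch of the performance profile idea described above: given a cost matrix t[p, s] (e.g. CPU time of solver s on problem p), the profile value for solver s at tau is the fraction of problems it solves within a factor tau of the best solver. The timing matrix below is made-up illustrative data, with np.inf marking a failure.

```python
# Dolan-More style performance profiles from a problems x solvers cost matrix.
import numpy as np

def performance_profile(t, taus):
    t = np.asarray(t, dtype=float)
    best = np.min(t, axis=1, keepdims=True)   # best cost per problem
    ratios = t / best                         # r[p, s] performance ratios
    # rho[s, i]: share of problems with ratio <= taus[i] for solver s
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(t.shape[1])])

# Usage with fabricated timings for 4 problems x 2 solvers
t = [[1.0, 2.0], [3.0, 1.5], [2.0, np.inf], [0.5, 0.6]]
print(performance_profile(t, taus=[1.0, 2.0, 4.0]))
```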
Conjugate gradient methods are widely used for unconstrained optimization, especially large-scale problems. The strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally, provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, being similar to those required by the Zoutendijk condition.
550 Citations
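For concreteness, here are predicate forms of the standard Wolfe conditions this entry relies on, alongside the strong Wolfe variant used by the main paper; the constants c1 and c2 are conventional example values, and the function name is illustrative.

```python
# Check the standard and strong Wolfe conditions for a candidate step alpha
# along direction d: sufficient decrease (Armijo) plus a curvature condition;
# the strong variant also bounds the new directional derivative from above.
import numpy as np

def wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    slope0 = grad(x) @ d                       # directional derivative at x
    slope_a = grad(x + alpha * d) @ d          # ... at the trial point
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * slope0
    curvature = slope_a >= c2 * slope0         # standard Wolfe
    strong = abs(slope_a) <= c2 * abs(slope0)  # strong Wolfe
    return armijo and curvature, armijo and strong

# Usage on a simple quadratic: both variants hold for this step
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
print(wolfe(f, grad, x, -grad(x), alpha=0.25))  # (True, True)
```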
Cited By (0)