Global convergence of a new class of nonlinear conjugate gradient methods with exact line search

Published on May 5, 2021
· DOI: 10.1063/5.0053215
Nur Haziqah Mohd Dani, Srimazzura Basri (Estimated H-index: 2), Mustafa Mamat (Estimated H-index: 15)
Abstract
Unconstrained optimization is a widespread class of problems that can be solved by a mathematical technique known as the conjugate gradient method. This method is favored for its simplicity and low computational cost, reflected in a small number of iterations and a short central processing unit (CPU) time. Motivated by the many modifications proposed for the conjugate gradient parameter, this study analyzes five conjugate gradient parameters, including the preferred parameter, a modification of the Hestenes-Stiefel conjugate gradient parameter. We focus on the unconstrained optimization problem under the exact line search, and we prove that the preferred conjugate gradient parameter satisfies the global convergence condition under the exact line search. The conjugate gradient method with each of the parameters was tested on 15 optimization test functions in MATLAB to determine whether the method with the preferred parameter performs more accurately and efficiently than the method with the other parameters, as measured by the number of iterations and the CPU time. Accuracy and efficiency were compared using the percentages obtained from the cumulative frequency graphs. The analysis shows that the conjugate gradient method with the preferred parameter is more accurate and efficient than the conjugate gradient method with the other parameters.
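To make the setup concrete, here is a minimal sketch of a nonlinear conjugate gradient iteration with an (approximately) exact line search. The abstract does not reproduce the modified Hestenes-Stiefel parameter itself, so the classical HS formula stands in for it; the `conjugate_gradient` function, the numerical line search via `scipy.optimize.minimize_scalar`, and the Rosenbrock test problem are all illustrative assumptions, not the paper's code.

```python
# Minimal sketch of nonlinear conjugate gradient with an (approximately)
# exact line search. The classical Hestenes-Stiefel beta is used here;
# the paper's modified parameter is not given in the abstract.
import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:          # gradient small enough: stop
            break
        # "Exact" line search: minimize f along d numerically.
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference
        beta = (g_new @ y) / (d @ y)         # Hestenes-Stiefel coefficient
        d = -g_new + beta * d                # new conjugate direction
        x, g = x_new, g_new
    return x, k

# Rosenbrock function, a standard unconstrained test problem.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star, iters = conjugate_gradient(f, grad, np.array([-1.2, 1.0]))
print(x_star, iters)   # converges near the minimizer (1, 1)
```

Counting the iterations `k` and timing this loop is exactly the kind of comparison the abstract describes; no safeguards (restarts, degenerate denominators) are included in this sketch.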
References (19)
#1 Srimazzura Basri (UniSZA: Universiti Sultan Zainal Abidin) H-Index: 2
#2 Mustafa Mamat (UTHM: Universiti Tun Hussein Onn Malaysia) H-Index: 15
Abstract Nonlinear conjugate gradient methods are widely used in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements. Numerous studies have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients that possess global convergence properties is proposed. The global convergence result using exact line searches is discussed. Numerical results show that the proposed method is more eff...
2 Citations
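For context, every method in this family shares the same iteration and differs only in the scalar coefficient; in the standard formulation (not reproduced from the cited paper),

```latex
% Standard nonlinear conjugate gradient iteration; each method in this
% family proposes its own choice of the scalar coefficient \beta_k.
x_{k+1} = x_k + \alpha_k d_k,
\qquad
d_k =
\begin{cases}
  -g_k, & k = 0,\\[2pt]
  -g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
```

where g_k = ∇f(x_k) and, under the exact line search, α_k = argmin_{α ≥ 0} f(x_k + α d_k).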
#1 Gonglin Yuan (Xida: Guangxi University) H-Index: 1
#2 Wujie Hu (Xida: Guangxi University) H-Index: 4
For large-scale unconstrained optimization problems and nonlinear equations, we propose a new three-term conjugate gradient algorithm under the Yuan–Wei–Lu line search technique. It combines the steepest descent method with the famous conjugate gradient algorithm, which utilizes both the relevant function trait and the current point feature. It possesses the following properties: (i) the search direction has a sufficient descent feature and a trust region trait, and (ii) the proposed algorithm g...
7 Citations
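A three-term method extends the two-term direction above with one extra vector; a common generic template (the specific Yuan-Wei-Lu quantities are not reproduced here) is

```latex
% Generic three-term conjugate gradient direction (illustrative template,
% not the exact Yuan-Wei-Lu formulas), with y_k = g_{k+1} - g_k.
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k,
```

with the scalars β_k and θ_k chosen so that the direction satisfies sufficient descent and a trust-region-like bound regardless of the line search.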
Conjugate gradient (CG) method is used to find the optimum solution for large-scale unconstrained optimization problems. Based on its simple algorithm, low memory requirement, and the speed of obtaining the solution, this method is widely used in many fields, such as engineering, computer science, and medical science. In this paper, we modified the CG method to achieve global convergence with various line searches. In addition, it satisfies the sufficient descent condition without any line sear...
3 Citations
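The sufficient descent condition mentioned in several of these abstracts has a standard statement: for some constant c > 0 independent of k,

```latex
% Sufficient descent condition: d_k is a descent direction whose angle
% with -g_k stays bounded away from 90 degrees.
g_k^\top d_k \le -c\,\|g_k\|^2 \quad \text{for all } k.
```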
#1 Mohamed Hamoda H-Index: 3
#2 Mustafa Mamat H-Index: 15
Last. Zabidin Salleh H-Index: 8
view all 4 authors...
In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems; it possesses the sufficient descent property with the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set of 138 unconstrained optimization test problems showed that this new conjugate gradient algorithm seems to converge more stably and is superior to ...
3 Citations
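For reference, the strong Wolfe-Powell (SWP) conditions require the step size α_k to satisfy, for constants 0 < δ < σ < 1,

```latex
% Strong Wolfe-Powell line search conditions: sufficient decrease plus a
% two-sided curvature bound.
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\,g_k^\top d_k,
\qquad
\bigl|g(x_k + \alpha_k d_k)^\top d_k\bigr| \le \sigma\,\bigl|g_k^\top d_k\bigr|.
```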
#2 Mustafa Mamat H-Index: 15
Last. Osman Omer H-Index: 2
view all 5 authors...
Conjugate gradient methods are effective in solving linear equations and in solving non-linear optimization problems. In this work we compare our new conjugate gradient coefficient βk with the classical formulas under the strong Wolfe line search; our method satisfies the sufficient descent condition. Numerical results have shown that the new βk performs better than the classical formulas.
11 Citations
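The classical formulas such comparisons refer to are the standard conjugate gradient coefficients; writing y_{k-1} = g_k - g_{k-1}, the best-known choices are

```latex
% Classical conjugate gradient coefficients: Hestenes-Stiefel (HS),
% Fletcher-Reeves (FR), and Polak-Ribiere-Polyak (PRP).
\beta_k^{HS} = \frac{g_k^\top y_{k-1}}{d_{k-1}^\top y_{k-1}},
\qquad
\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2},
\qquad
\beta_k^{PRP} = \frac{g_k^\top y_{k-1}}{\|g_{k-1}\|^2}.
```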
#1 Momin Jamil (BTH: Blekinge Institute of Technology) H-Index: 6
#2 Xin-She Yang (Middlesex University) H-Index: 87
Test functions are important to validate and compare the performance of optimisation algorithms. There have been many test or benchmark functions reported in the literature; however, there is no standard list or set of benchmark functions. Ideally, test functions should have diverse properties to be truly useful to test new algorithms in an unbiased way. For this purpose, we have reviewed and compiled a rich set of 175 benchmark functions for unconstrained optimisation problems with diverse prop...
590 Citations
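Two classic members of such benchmark collections are sketched below for concreteness; these are the standard definitions and minimizers, not material taken from this page, and the full 175-function set is of course not reproduced.

```python
# Two classic unconstrained benchmark functions of the kind compiled in
# such collections (illustrative; not the full 175-function set).
import numpy as np

def sphere(x):
    """Sphere function: global minimum 0 at the origin, any dimension."""
    return float(np.sum(np.asarray(x) ** 2))

def booth(x):
    """Booth function (2-D): global minimum 0 at (1, 3)."""
    return float((x[0] + 2 * x[1] - 7) ** 2 + (2 * x[0] + x[1] - 5) ** 2)

print(sphere(np.zeros(4)))          # 0.0 at the known minimizer
print(booth(np.array([1.0, 3.0])))  # 0.0 at the known minimizer
```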
#1 Mohd Rivaie H-Index: 7
#2 Mustafa Mamat H-Index: 15
Last. Ismail Mohd H-Index: 8
view all 4 authors...
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization due to their low memory requirements and global convergence properties. Numerous studies and modifications have been devoted recently to improving this method. In this paper, a new modification of the conjugate gradient coefficient (βk) with global convergence properties is presented. The global convergence result is established using exact line searches. Preliminary results show that the p...
4 Citations
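Global convergence proofs under the exact line search typically exploit its first-order optimality condition: the new gradient is orthogonal to the previous direction,

```latex
% Exact line search and the orthogonality it implies.
\alpha_k = \arg\min_{\alpha \ge 0} f(x_k + \alpha d_k)
\;\Longrightarrow\;
g_{k+1}^\top d_k = 0,
```

so that g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 + β_k g_{k+1}^T d_k = -||g_{k+1}||^2 < 0, and every direction is a descent direction regardless of the choice of β_k.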
#1 Neculai Andrei H-Index: 20
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995); others are from Moré, Garbow and...
260 Citations
Cited By (0)