The Convergence Properties of a New Kind of Conjugate Gradient Method for Unconstrained Optimization

Published on Jan 1, 2015 in Applied Mathematical Sciences · DOI: 10.12988/AMS.2015.411997
Authors: Rabi’u Bashir Yunus, Mustafa Mamat, + 3 authors, Zahrahtul Amani Zakaria
Abstract
Conjugate gradient (CG) methods are among the most prominent techniques for solving large-scale unconstrained optimization problems, owing to their robustness, low memory requirements, and global convergence properties. Numerous studies and modifications have been carried out recently to improve these methods. In this paper, a new modification of the CG coefficient that possesses global convergence properties is presented. The global convergence result is established under exact line search. Numerical experiments show that the proposed formula is robust and efficient when compared with other CG coefficients.
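To make the structure of such methods concrete, here is a minimal sketch of a nonlinear CG iteration with a numerically approximated exact line search. The beta rule shown is the classical Fletcher-Reeves coefficient, used only as a placeholder; the modified coefficient proposed in the paper is not reproduced on this page, and the helper names below are illustrative.

```python
# Minimal sketch of a nonlinear conjugate gradient loop (not the paper's method).
# beta is the classical Fletcher-Reeves coefficient, used here as a placeholder.
# The "exact" line search is approximated with a bounded 1-D minimizer.
import numpy as np
from scipy.optimize import minimize_scalar

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:          # stopping test on the gradient norm
            break
        # approximately exact line search along d (interval chosen for illustration)
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves placeholder for beta_k
        d = -g_new + beta * d                 # conjugate direction update
        x, g = x_new, g_new
    return x

# Example: 2-D Rosenbrock function from the customary starting point (-1.2, 1)
f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                           200.0 * (x[1] - x[0]**2)])
print(cg_minimize(f, grad, [-1.2, 1.0]))
```

Swapping in a different beta_k rule, such as the one proposed in the paper, only changes the single line marked as the placeholder.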
References (21)
Mustafa Mamat, ..., Ismail Mohd (4 authors)
Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems. Many studies have been devoted to modifying and improving these methods. In this paper, a new CG parameter that possesses global convergence properties under exact line search is proposed. Numerical results show that the new formula is more efficient than the other classical CG methods.
Mohd Rivaie, ..., Ismail Mohd (4 authors)
Conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems. Many studies have been devoted to developing and improving these methods. In this paper, we compare our new CG coefficient ...
Mohd Rivaie (UiTM: Universiti Teknologi MARA), Mustafa Mamat (UMT: Universiti Malaysia Terengganu), ..., Ismail Mohd (UMT: Universiti Malaysia Terengganu) (4 authors)
Nonlinear conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements and global convergence properties. Numerous studies and modifications have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients (β_k) that possess global convergence properties is presented. The global convergence result is establi...
Li Zhang (CSUST: Changsha University of Science and Technology)
In this paper, we make a slight modification to the Wei-Yao-Liu nonlinear conjugate gradient method proposed by Wei et al. [Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Appl. Math. Comput. 183 (2006) 1341-1350] such that the modified method possesses better convergence properties. In fact, we prove that the modified method satisfies the sufficient descent condition with parameter σ ∈ (0, 1/2) in the strong Wolfe line search and converg...
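For context, the sufficient descent condition referred to in this entry is commonly stated as follows; c > 0 is a constant, and the notation is generic rather than taken from the cited paper.

```latex
% Sufficient descent condition: each search direction d_k must yield a descent
% proportional to the squared gradient norm, for some constant c > 0.
g_k^{\mathsf T} d_k \le -c\,\lVert g_k \rVert^{2} \qquad \text{for all } k.
```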
Neculai Andrei
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995), others are from Moré, Garbow and...
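As an illustration of the kind of entry such a collection contains, the following sketch gives a classical test function (the extended Rosenbrock function) with its customary standard starting point; it is a generic textbook instance, not an excerpt from Andrei's collection.

```python
# Extended Rosenbrock test function with its customary standard starting point.
# Generic illustration only; not copied from the cited collection.
import numpy as np

def extended_rosenbrock(x):
    x = np.asarray(x, dtype=float)
    odd, even = x[0::2], x[1::2]            # consecutive pairs (x_1, x_2), (x_3, x_4), ...
    return np.sum(100.0 * (even - odd**2)**2 + (1.0 - odd)**2)

def standard_start(n):
    x0 = np.ones(n)                         # customary start: (-1.2, 1, -1.2, 1, ...)
    x0[0::2] = -1.2
    return x0

print(extended_rosenbrock(standard_start(10)))   # function value at the standard start
```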
Zengxin Wei (Xida: Guangxi University), Shengwei Yao (Xida: Guangxi University), Liying Liu (Liaocheng University) (3 authors)
In this paper, a new conjugate gradient formula β_k^* is given to compute the search directions for unconstrained optimization problems. General convergence results for the proposed formula with several line searches, such as the exact line search, the Wolfe–Powell line search and the Grippo–Lucidi line search, are discussed. Under the above line searches and some assumptions, the global convergence properties of the given methods are discussed. The given formula satisfies β_k^* ⩾ 0 and has the simi...
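For reference, the Wei-Yao-Liu coefficient discussed in this entry is commonly quoted in the CG literature in the following form; the original paper should be consulted for the exact statement and normalization.

```latex
% Wei-Yao-Liu conjugate gradient coefficient, as commonly quoted in the literature
\beta_k^{\mathrm{WYL}}
  = \frac{g_k^{\mathsf T}\left( g_k - \frac{\lVert g_k \rVert}{\lVert g_{k-1} \rVert}\, g_{k-1} \right)}
         {\lVert g_{k-1} \rVert^{2}}.
```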
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition g_k^T d_k ≤ -(7/8)||g_k||^2. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A n...
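For reference, the update proposed in this entry (the method underlying the CG_DESCENT software) is commonly written as follows, with y_k = g_{k+1} - g_k; the truncation safeguard used in practice is omitted here, so consult the original paper for the precise rule.

```latex
% Hager-Zhang conjugate gradient coefficient, as commonly quoted; y_k = g_{k+1} - g_k
\beta_k^{\mathrm{HZ}}
  = \frac{1}{d_k^{\mathsf T} y_k}
    \left( y_k - 2\, d_k\, \frac{\lVert y_k \rVert^{2}}{d_k^{\mathsf T} y_k} \right)^{\mathsf T} g_{k+1}.
```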
William W. Hager (UF: University of Florida), Hongchao Zhang (UF: University of Florida)
This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.
Mohd Rivaie, Mustafa Mamat, ..., Muhammad Fauzi (4 authors)
Conjugate gradient methods are popular in the field of unconstrained optimization. Numerous studies have been devoted recently to improving these methods. In this paper, three of our newly proposed conjugate gradient coefficients (β_k) are compared with the six most common (β_k) proposed by earlier researchers. The first proposed method is based on the reciprocal of the summation of the eigenvalues. The second and third proposed methods are based on modifications of the original Polak-Ribiere...
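Since two of the coefficients in this entry are described as modifications of the Polak-Ribière rule, the classical Polak-Ribière-Polyak coefficient is reproduced here for comparison only; the authors' own modifications are not shown on this page.

```latex
% Classical Polak-Ribiere-Polyak coefficient, shown for comparison only
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\mathsf T} ( g_k - g_{k-1} )}{\lVert g_{k-1} \rVert^{2}}.
```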
Elizabeth D. Dolan (Argonne National Laboratory), Jorge J. Moré (Argonne National Laboratory)
We propose performance profiles — distribution functions for a performance metric — as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
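As a rough illustration of the performance-profile idea described above, the following sketch computes profile values from a matrix of solver costs; the function name and the toy data are illustrative, not taken from the cited paper.

```python
# Minimal sketch of Dolan-Moré performance profiles: for each solver s, the profile
# rho_s(tau) is the fraction of problems on which s is within a factor tau of the
# best solver for some cost metric (CPU time, iterations, ...). Toy data only.
import numpy as np

def performance_profiles(costs, taus):
    """costs[p, s] = cost of solver s on problem p (np.inf marks a failure)."""
    costs = np.asarray(costs, dtype=float)
    best = costs.min(axis=1, keepdims=True)            # best cost on each problem
    ratios = costs / best                               # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(costs.shape[1])])   # rho_s(tau) per solver

# Toy example: 4 problems, 2 solvers (columns), costs in seconds
costs = np.array([[1.0, 2.0],
                  [3.0, 1.5],
                  [2.0, np.inf],    # the second solver failed on this problem
                  [0.5, 0.6]])
print(performance_profiles(costs, [1.0, 2.0, 4.0]))
```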
Cited By (1)
Norrlaili Shapiee, Mohd Rivaie, Mustafa Mamat (3 authors)
In this paper, we propose a new classical conjugate gradient method. Global convergence is established using exact line search. Numerical results are presented based on the number of iterations and CPU time. These results show that our method performs better than the classical CG methods on a set of standard test problems.