A New Conjugate Gradient Coefficient for Large Scale Nonlinear Unconstrained Optimization

Published on Jan 1, 2012
Mohd Rivaie (Estimated H-index: 7), Mustafa Mamat (Estimated H-index: 15), +1 author, Ismail Mohd (Estimated H-index: 8)
Abstract
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization problems due to their low memory requirements and global convergence properties. Numerous studies and modifications have recently been devoted to improving this method. In this paper, a new modification of the conjugate gradient coefficient (β_k) with global convergence properties is presented. The global convergence result is established using exact line searches. Preliminary results show that the proposed formula is competitive when compared to other CG coefficients. Mathematics Subject Classification: 65K10, 49M37
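For orientation, here is a minimal sketch of the nonlinear CG framework the abstract refers to. The paper's new β_k formula is not reproduced on this page, so the classical Fletcher-Reeves coefficient stands in for it, and the exact line search α_k = argmin f(x_k + α d_k) is approximated with SciPy's scalar minimizer; the name cg_minimize is illustrative only.

import numpy as np
from scipy.optimize import minimize_scalar

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic nonlinear CG loop; beta below is a stand-in (Fletcher-Reeves),
    # not the coefficient proposed in the paper.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Exact" line search: minimize f along d (Brent's method here)
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves beta_k
        d = -g_new + beta * d                 # conjugate direction update
        g = g_new
    return x

# Example: the two-dimensional Rosenbrock function
f = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0]**2)])
print(cg_minimize(f, grad, [-1.2, 1.0]))      # converges near (1, 1)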
References (19)
#1 Norrlaili Shapiee (UMT: Universiti Malaysia Terengganu), H-index: 3
Last: Zainal Abidin, H-index: 2 (5 authors in total)
Abstract: The conjugate gradient method holds an important role in solving unconstrained optimization problems, especially large-scale ones. Numerous studies and modifications have been carried out to improve this method. In this paper, we propose a fundamentally different conjugate gradient method in which the well-known coefficient β_k is computed using the eigenvalues generated from the exact Hessian matrix of f(x). This relatively easy formula for β_k makes the conjugate gradient method very simple, ...
10 Citations
#1 Yu-Hong Dai (CAS: Chinese Academy of Sciences), H-index: 35
Conjugate gradient methods are a class of important methods for solving linear equations and nonlinear optimization problems. In this article, a review of conjugate gradient methods for unconstrained optimization is given. They are divided into early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Two general convergence theorems are provided for the conjugate gradient method, assuming the descent property of each search direct...
135 Citations
Abstract: Based on the modified secant equation, we propose two new HS-type conjugate gradient formulas. Their forms are similar to the original HS conjugate gradient formula and inherit all the nice properties of the HS method. By utilizing the technique of the three-term HS method in Zhang et al. (2007) [15], without requiring truncation or convexity of the objective function, we show that one with a Wolfe line search and the other with an Armijo line search are globally convergent. Moreover, ...
23 Citations
#1 Gonglin Yuan (Xida: Guangxi University), H-index: 3
#2 Sha Lu, H-index: 3
Last: Zengxin Wei, H-index: 1 (3 authors in total)
It is well known that line search methods play a very important role in optimization. In this paper, a new line search method is proposed for solving unconstrained optimization problems. Under weak conditions, this method possesses global convergence and R-linear convergence for nonconvex and convex functions, respectively. Moreover, the given search direction has the sufficient descent property and belongs to a trust region without carrying out any line search rule. Numerical results ...
15 Citations
#1 Gonglin Yuan (Xida: Guangxi University), H-index: 8
#2 Xiwen Lu (ECUST: East China University of Science and Technology), H-index: 3
Last: Zengxin Wei (Xida: Guangxi University), H-index: 22 (3 authors in total)
A modified conjugate gradient method is presented for solving unconstrained optimization problems, which possesses the following properties: (i) the sufficient descent property is satisfied without any line search; (ii) the search direction is automatically in a trust region; (iii) the Zoutendijk condition holds for the Wolfe-Powell line search technique; (iv) this method inherits an important property of the well-known Polak-Ribière-Polyak (PRP) method: the tendency to turn towards the ste...
69 Citations
#1 Radosław Pytlak, H-index: 1
Contents: Conjugate Direction Methods for Quadratic Problems; Conjugate Gradient Methods for Nonconvex Problems; Memoryless Quasi-Newton Methods; Preconditioned Conjugate Gradient Algorithms; Limited Memory Quasi-Newton Algorithms; The Method of Shortest Residuals and Nondifferentiable Optimization; The Method of Shortest Residuals for Differentiable Problems; The Preconditioned Shortest Residuals Algorithm; Optimization on a Polyhedron; Conjugate Gradient Algorithms for Problems with Box Constra...
40 Citations
#1 Neculai Andrei, H-index: 20
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995); others are from Moré, Garbow and...
295 Citations
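As an illustration of the kind of entry such a collection contains (this sketch is not copied from Andrei's list), here is the well-known Extended Rosenbrock function together with its customary standard initial point:

import numpy as np

def extended_rosenbrock(x):
    # f(x) = sum_i [ 100*(x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2 ], n even
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1::2] - x[::2]**2)**2 + (1.0 - x[::2])**2)

def standard_initial_point(n):
    # Customary start (-1.2, 1, -1.2, 1, ...); the minimizer is (1, ..., 1)
    x0 = np.empty(n)
    x0[::2], x0[1::2] = -1.2, 1.0
    return x0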
#1 Elizabeth D. Dolan (Argonne National Laboratory), H-index: 6
Last: Jorge J. Moré (Argonne National Laboratory), H-index: 50 (2 authors in total)
We propose performance profiles — distribution functions for a performance metric — as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
2,603 Citations
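The idea translates into very little code. A minimal sketch, assuming a cost matrix T with one row per problem and one column per solver (np.inf marking failures): the profile ρ_s(τ) is the fraction of problems that solver s solves within a factor τ of the best solver on each problem.

import numpy as np

def performance_profile(T, taus):
    # T[p, s]: cost (e.g., CPU time) of solver s on problem p; np.inf = failure
    r = T / T.min(axis=1, keepdims=True)      # performance ratios r_{p,s}
    # rho_s(tau) = fraction of problems with r_{p,s} <= tau
    return np.array([(r <= tau).mean(axis=0) for tau in taus])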
#1 Jean Charles Gilbert (INRIA: French Institute for Research in Computer Science and Automation), H-index: 10
#2 Jorge Nocedal, H-index: 58
This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribière method. Numerical experiments are presented.
727 Citations
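For context, the two coefficients underlying the families analyzed there are, in standard notation with g_k the gradient at the k-th iterate:

\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
\beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2}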
#1 Y. Liu (Lboro: Loughborough University), H-index: 1
#2 C. Storey (Lboro: Loughborough University), H-index: 7
The effect of inexact line search on conjugacy is studied in unconstrained optimization. A generalized conjugate gradient method based on this effect is proposed and shown to have global convergence for a twice continuously differentiable function with a bounded level set.
307 Citations
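The coefficient commonly attributed to this paper, with d_{k-1} the previous search direction, is:

\beta_k^{LS} = \frac{g_k^T (g_k - g_{k-1})}{-d_{k-1}^T g_{k-1}}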
Cited By (10)
#2 Srimazzura Basri, H-index: 2
Last: Mustafa Mamat, H-index: 15 (3 authors in total)
Unconstrained optimization is a widespread problem that can be solved by a mathematical technique known as the conjugate gradient method. This method is chosen for its simplicity and efficiency, reflected in a lower number of iterations and a faster central processing unit (CPU) time. Motivated by this, and by the many modifications of the conjugate gradient parameter, we are interested in pursuing this line of research. Therefore, in ...
#2 Srimazzura Basri, H-index: 2
Last: Mustafa Mamat, H-index: 15 (3 authors in total)
The nonlinear conjugate gradient method is widely used in solving large-scale unconstrained optimization, since it has been proven to solve optimization problems without using large memory storage. In this paper, we propose a new modification of the Hestenes-Stiefel conjugate gradient parameter that fulfils the sufficient descent condition using a strong Wolfe-Powell line search. Besides, the conjugate gradient method with the proposed parameter also guarantees low computation of i...
#1 Ibrahim Mohammed Sulaiman (UniSZA: Universiti Sultan Zainal Abidin), H-index: 4
#2 Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), H-index: 15
Last: Muktar Danlami, H-index: 1 (4 authors in total)
In this paper, we suggest a descent modification of the conjugate gradient method which converges globally provided that the exact minimization condition is satisfied. Preliminary numerical experiments on some benchmark problems show that the method is efficient and promising.
1 Citation
#1 Talat Alkhouli, H-index: 1
Last: Mohd Rivaie, H-index: 7 (4 authors in total)
#1 M.R. Mamat, H-index: 2
1 Citation
#1 Srimazzura Basri (UniSZA: Universiti Sultan Zainal Abidin), H-index: 2
Last: Mustafa Mamat (UTHM: Universiti Tun Hussein Onn Malaysia), H-index: 15 (2 authors in total)
Abstract: Nonlinear conjugate gradient methods are widely used in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements. Numerous studies have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients that possess global convergence properties is proposed. The global convergence result using exact line searches is discussed. Numerical results show that the proposed method is more eff...
2 Citations
#1 Kamilu Kamfa, H-index: 2
#2 Mustafa Mamat, H-index: 15
Last: Zabidin Salleh, H-index: 8 (5 authors in total)
#1 Awad Abdelrahman, H-index: 1
#2 Mustafa Mamat, H-index: 4
Last: Osman Omer, H-index: 2 (4 authors in total)
Nonlinear conjugate gradient (CG) methods are among the most important methods for solving large-scale unconstrained optimization problems. Many studies and modifications have been conducted recently to improve this method. In this paper, a new class of conjugate gradient coefficients (β_k) with a new parameter m = ‖g_k‖/‖g_{k−1}‖ that possess global convergence properties is presented. The global convergence and sufficient descent property results are established using inexact line searches to determine the (α_k...
1 Citation
#1 Kamilu Kamfa, H-index: 2
#2 Mustafa Mamat, H-index: 15
Last: Zabidin Salleh, H-index: 8 (6 authors in total)
Conjugate gradient (CG) methods are considered for solving nonlinear unconstrained optimization problems because of their simplicity, low memory requirements and global convergence properties. Various reviews and modifications have been carried out to upgrade the method. In this paper, a new type of CG parameter, which satisfies the sufficient descent condition and the global convergence property under exact line search, is proposed. The numerical outcomes indicate that our new modified par...
3 Citations
Conjugate gradient (CG) methods are well known for solving nonlinear unconstrained optimization problems because they require low computational memory. In this paper, we propose a new conjugate gradient coefficient (β_k) which possesses global convergence properties under both exact and inexact line searches. The given method satisfies the sufficient descent condition under the strong Wolfe line search. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) have shown that the new...