A simple three-term conjugate gradient algorithm for unconstrained optimization

Published on Mar 1, 2013 in Journal of Computational and Applied Mathematics (impact factor 2.621)
· DOI: 10.1016/J.CAM.2012.10.002
Neculai Andrei (Estimated H-index: 21)
Abstract
A simple three-term conjugate gradient algorithm which satisfies both the descent condition and the conjugacy condition is presented. This algorithm is a modification of the Hestenes and Stiefel algorithm (Hestenes and Stiefel, 1952) [10], or of that of Hager and Zhang (Hager and Zhang, 2005) [23], in such a way that the search direction is a descent direction and satisfies the conjugacy condition. These properties are independent of the line search. The algorithm can also be considered a modification of the memoryless BFGS quasi-Newton method. The new approximation of the minimum is obtained by a general Wolfe line search, combined with a standard acceleration technique developed by Andrei (2009) [27]. For uniformly convex functions, under standard assumptions, the global convergence of the algorithm is proved. Numerical comparisons of the suggested three-term conjugate gradient algorithm against six other three-term conjugate gradient algorithms, on a set of 750 unconstrained optimization problems, show that all these computational schemes have similar performance, the suggested one being slightly faster and more robust. The proposed three-term conjugate gradient algorithm substantially outperforms the well-known Hestenes and Stiefel conjugate gradient algorithm, as well as the more elaborate CG_DESCENT algorithm. Using five applications from the MINPACK-2 test problem collection (Averick et al., 1992) [25], each with 10^6 variables, we show that the suggested three-term conjugate gradient algorithm is the top performer relative to CG_DESCENT.
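The paper itself gives no code, so the sketch below is only a hedged illustration of the general pattern, not Andrei's exact update: it uses the three-term Hestenes-Stiefel direction of Zhang, Zhou and Li (cited in the references below), which satisfies g_k^T d_k = -||g_k||^2 for any line search. The function names, tolerances, and the fallback steplength are assumptions for this example, and the acceleration step of Andrei (2009) is omitted.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f with a three-term Hestenes-Stiefel CG direction.

    The direction d_k = -g_k + beta_k*d_{k-1} - theta_k*y_{k-1}
    satisfies g_k^T d_k = -||g_k||^2 regardless of the steplength.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Wolfe line search; scipy returns None for alpha on failure.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # crude fallback, an assumption for this sketch
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ y) / dy    # Hestenes-Stiefel parameter
            theta = (g_new @ d) / dy   # weight of the third term
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                 # restart when the curvature pair degenerates
        x, g = x_new, g_new
    return x

# Toy usage on the Rosenbrock function.
print(three_term_cg(rosen, rosen_der, [-1.2, 1.0]))
```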
References (37)
#1 Li Zhang, H-index: 1
#2 Youhua Zhou, H-index: 1
Recently, Zhang, Zhou and Li [Optim. Methods Softw., 22 (2007), pp. 697-711] proposed a three-term Hestenes-Stiefel (THS) nonlinear conjugate gradient method for optimization and proved its global convergence for strongly convex functions. In this note we further investigate the convergence properties of the THS method on convex optimization. We show that the THS method converges globally for convex functions with the strong Wolfe line search and obtain its R-linear convergence rate under suitab...
A modified Polak–Ribière–Polyak conjugate gradient algorithm which satisfies both the sufficient descent condition and the conjugacy condition is presented. These properties are independent of the line search. The algorithm uses the standard Wolfe line search. Under standard assumptions, we show the global convergence of the algorithm. Numerical comparisons with conjugate gradient algorithms using a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that thi...
#1 Yasushi Narushima, H-index: 9
#2 Hiroshi Yabe, H-index: 16
Last: John A. Ford, H-index: 11
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not need the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed method. Moreover, we present a specific three-term conjugate gradient method based on the multistep quasi-Newton method. Finally, some nu...
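As a point of reference (the exact formulation in the paper may differ; this is the form commonly quoted for this family), the general three-term direction can be written as

\[
d_k = -g_k + \beta_k d_{k-1} - \beta_k \frac{g_k^{\top} d_{k-1}}{g_k^{\top} p_k}\, p_k ,
\qquad g_k^{\top} p_k \neq 0 ,
\]

which gives g_k^T d_k = -||g_k||^2 for any choice of the scalar β_k and the vector p_k, so the sufficient descent property holds independently of the line search.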
#1 Jianguo Zhang (Henan University), H-index: 1
#2 Yunhai Xiao, H-index: 14
Last: Zengxin Wei, H-index: 21
Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions always descend. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.
Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods using a modification of the steplength. The idea is to modify, in a multiplicative manner, the steplength α_k computed by the Wolfe line search conditions, by means of a positive parameter η_k, in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex functions the convergence of the accelerated algori...
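A hedged reconstruction of the acceleration step (the notation here is assumed, not quoted from the paper): with z = x_k + α_k d_k and g_z = ∇f(z), set

\[
\bar a_k = \alpha_k\, g_k^{\top} d_k , \qquad
\bar b_k = -\alpha_k\, (g_z - g_k)^{\top} d_k ,
\]

and, when \bar b_k > 0, take the accelerated step x_{k+1} = x_k + η_k α_k d_k with η_k = \bar a_k / \bar b_k. This η_k is exactly the minimizer of the quadratic interpolant of φ(η) = f(x_k + η α_k d_k) built from the slopes φ'(0) = \bar a_k and φ'(1) = α_k g_z^T d_k.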
#1 Roger Fletcher, H-index: 34
Preface. Table of Notation. Part 1: Unconstrained Optimization: Introduction; Structure of Methods; Newton-like Methods; Conjugate Direction Methods; Restricted Step Methods; Sums of Squares and Nonlinear Equations. Part 2: Constrained Optimization: Introduction; Linear Programming; The Theory of Constrained Optimization; Quadratic Programming; General Linearly Constrained Optimization; Nonlinear Programming; Other Optimization Problems; Non-Smooth Optimization. References. Subject Index.
#1 Neculai Andrei, H-index: 21
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995); others are from Moré, Garbow and...
In this paper, by the use of a projection of the PRP (Polak–Ribière–Polyak) conjugate gradient direction, we develop a PRP-based descent method for solving unconstrained optimization problems. The method provides a sufficient descent direction for the objective function. Moreover, if an exact line search is used, the method reduces to the standard PRP method. Under suitable conditions, we show that the method with some backtracking line search or the generalized Wolfe-type line search is globally con...
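The modified PRP direction usually associated with this approach (stated here as the commonly cited form, not a quote from the paper) is

\[
d_k = -g_k + \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^2}\, d_{k-1}
      - \frac{g_k^{\top} d_{k-1}}{\|g_{k-1}\|^2}\, y_{k-1} ,
\]

which satisfies g_k^T d_k = -||g_k||^2 identically; under an exact line search g_k^T d_{k-1} = 0, so the third term vanishes and the method reduces to standard PRP, as the abstract states.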
In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG by Birgin and Martinez (2001), which is mainly a scaled variant of Perry's (1977), is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newto...
#1 Li Zhang (CSUST: Changsha University of Science and Technology), H-index: 3
#2 Weijun Zhou (Hunan University), H-index: 5
Last: Dong-Hui Li (Hunan University), H-index: 17
In this paper, we propose a three-term conjugate gradient method which satisfies the sufficient descent condition g_k^T d_k = -||g_k||^2. This property is independent of any line search used. When an exact line search is used, this method reduces to the standard Hestenes-Stiefel conjugate gradient method. We also introduce two variants of the proposed method which still preserve the sufficient descent property, and prove that these two methods converge globally with the standard Wolfe line search ...
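For concreteness, the three-term Hestenes-Stiefel direction usually quoted for this method (given here as the commonly cited form) is

\[
d_k = -g_k + \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}\, d_{k-1}
      - \frac{g_k^{\top} d_{k-1}}{d_{k-1}^{\top} y_{k-1}}\, y_{k-1} ,
\]

so that g_k^T d_k = -||g_k||^2 for any line search, and the third term vanishes (recovering standard Hestenes-Stiefel) when the line search is exact, since then g_k^T d_{k-1} = 0.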
Cited By (46)
#1 Ahmad Alhawarat (UMT: Universiti Malaysia Terengganu), H-index: 5
#2 Ghaliah Alhamzi (Islamic University)
Last: Zabidin Salleh (UMT: Universiti Malaysia Terengganu), H-index: 11
#1 Xiaoliang Dong (MUC: Minzu University of China), H-index: 9
#2 Zhifeng Dai (CSUST: Changsha University of Science and Technology), H-index: 12
Last: Xiangli Li (GUET: Guilin University of Electronic Technology), H-index: 6
In this paper, an adaptive three-term conjugate gradient method is proposed for solving unconstrained problems, which generates sufficient descent directions at each iteration. Different from existing methods, our proposed method performs a dynamical adjustment between the Hestenes–Stiefel and Dai–Liao conjugacy conditions. Under mild conditions, we show that the proposed method converges globally. Numerical experimentation with the new method indicates that it efficiently solves the test ...
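For context, the Dai-Liao conjugacy condition referred to here is

\[
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k , \qquad t \ge 0 ,
\]

which reduces to the classical conjugacy condition d_{k+1}^T y_k = 0 when t = 0 or when the line search is exact.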
#1 Auwal Bala Abubakar (Sefako Makgatho Health Sciences University), H-index: 13
#2 Poom Kumam (China Medical University (Taiwan)), H-index: 48
Last: Abdulkarim Hassan Ibrahim (King Mongkut's University of Technology Thonburi), H-index: 12
In this article, we propose a hybrid conjugate gradient (CG) scheme for solving unconstrained optimization problems. The search direction is a combination of the Polak–Ribière–Polyak (PRP) and Liu–Storey (LS) CG parameters and is close to the direction of the memoryless Broyden–Fletcher–Goldfarb–Shanno (BFGS) quasi-Newton scheme. Without the use of a line search, the search direction satisfies the descent condition and possesses the trust region property. The global conve...
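For reference, the memoryless BFGS direction that such hybrid schemes approach is obtained by applying the BFGS inverse update to the identity matrix:

\[
d_{k+1} = -g_{k+1}
+ \frac{g_{k+1}^{\top} y_k}{y_k^{\top} s_k}\, s_k
+ \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k}\, y_k
- \left(1 + \frac{y_k^{\top} y_k}{y_k^{\top} s_k}\right)
  \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k}\, s_k ,
\]

with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.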
Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they have attractive practical features such as simple computation, low memory requirements, a better descent property and a strong global convergence property. In this paper, a hybrid three-term conjugate gradient algorithm is proposed; it possesses a sufficient descent property independent of any line search technique. Under some mild conditions, the proposed method is glo...
#1 Jie Guo (CSU: Central South University)
#2 Zhong Wan (CSU: Central South University), H-index: 13
A new spectral three-term conjugate gradient algorithm based on the quasi-Newton equation is developed for solving large-scale unconstrained optimization problems. It is proved that the search ...
#1 Shengwei Yao (Xida: Guangxi University), H-index: 1
#2 Yuping Wu (Xida: Guangxi University), H-index: 1
Last: Jieqiong Xu (Xida: Guangxi University), H-index: 1
We propose a three-term gradient descent method that is well suited to the optimization problems addressed in this article. The search direction of the method is generated in a specific subspace; specifically, a quadratic approximation model is applied in the process of generating the search direction. In order to reduce the amount of calculation and make the best use of existing information, the subspace is made up of the gradients of the current and prior iteration points and t...
#2 Poom Kumam (King Mongkut's University of Technology Thonburi), H-index: 48
#3 Maulana Malik, H-index: 4
Last: Abdulkarim Hassan Ibrahim, H-index: 12
In this paper, we present a new hybrid conjugate gradient (CG) approach for solving unconstrained optimization problems. The search direction is a hybrid of the Fletcher-Reeves (FR) and Dai-Yuan (DY) CG parameters and is close to the direction of the memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton approach. Independent of the line search, the search direction of the new approach satisfies the descent condition and possesses the trust region property. We establish the global converge...
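For reference, the two parameters being hybridized are

\[
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2} , \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k} ,
\]

with y_k = g_{k+1} - g_k; the hybrid parameter is typically some combination of, or projection between, the two.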
#1 Xiaoliang Dong (Nanjing Normal University), H-index: 9
In this paper, a small but necessary revision of an assumption of Aminifard and Babaie-Kafaki (Calcolo, 2019. https://doi.org/10.1007/s10092-019-0312-9) is made. With this small modification, a new conjugate gradient method is proposed in which the search directions satisfy the sufficient descent condition with the strong Wolfe line search. The main difference between the two algorithms is that the proposed method is globally convergent without a boundedness assumption on the steplength. Compa...
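The strong Wolfe line search mentioned here requires the steplength α_k to satisfy

\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{\top} d_k , \qquad
|g(x_k + \alpha_k d_k)^{\top} d_k| \le c_2\, |g_k^{\top} d_k| ,
\]

with constants 0 < c_1 < c_2 < 1.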
#1 Ibrahim Mohammed Sulaiman (UniSZA: Universiti Sultan Zainal Abidin), H-index: 6
#2 Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), H-index: 16
Last: Maulana Malik, H-index: 4