A New Perry Conjugate Gradient Method with the Generalized Conjugacy Condition

Published on Dec 30, 2010 at CISE 2010 (International Conference on Computational Intelligence and Software Engineering) · DOI: 10.1109/CISE.2010.5677114
#1 Dongyi Liu (TJU: Tianjin University) · Estimated H-Index: 1
#2 Yingfeng Shang (TJU: Tianjin University) · Estimated H-Index: 1
Abstract
The Perry conjugate gradient method is generalized, and from this generalization a new descent algorithm, PDCGy, is derived; its global convergence under Wolfe line searches is proven. Preliminary numerical results on a set of 720 unconstrained optimization test problems confirm the performance of the algorithm and show that PDCGy is competitive with the CG_DESCENT algorithm.
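The page does not reproduce the PDCGy update itself, so as orientation only, here is a minimal sketch of a Perry/Dai–Liao-type conjugate gradient loop under a Wolfe line search; the function name `perry_cg`, the parameter `t`, and the restart safeguard are illustrative assumptions, not the paper's method.

```python
# Minimal sketch (not the paper's PDCGy): a Perry/Dai-Liao-type CG loop
# with a Wolfe line search. t, tol, max_iter are illustrative choices.
import numpy as np
from scipy.optimize import line_search

def perry_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the (strong) Wolfe conditions
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                   # search failed: restart along -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Perry (t = 1) / Dai-Liao beta; the resulting direction satisfies
        # the generalized conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k
        beta = g_new @ (y - t * s) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function
from scipy.optimize import rosen, rosen_der
x_star = perry_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
```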
References (17)
In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG by Birgin and Martinez (2001), which is mainly a scaled variant of Perry's (1977), is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton ...
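For reference, the secant equation this entry interprets is, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, the condition $B_{k+1} s_k = y_k$; a symmetric $B_{k+1}$ satisfying it can be positive definite only if the curvature condition $s_k^{\mathsf{T}} y_k > 0$ holds, which the second Wolfe condition guarantees.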
#1 William W. Hager (UF: University of Florida) · H-Index: 57
#2 Hongchao Zhang (UF: University of Florida) · H-Index: 25
Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition $g_k^{\mathsf{T}} d_k \le -\frac{7}{8} \|g_k\|^2$ and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition $\mathbf{g}_k^{\mathsf{T}} \mathbf{d}_k \le -\frac{7}{8} \|\mathbf{g}_k\|^2$. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. ...
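As a quick numerical illustration of the descent condition quoted in the two Hager–Zhang entries above, the following sketch forms the CG_DESCENT direction from its published formula and spot-checks the bound; the helper name and random test data are invented for the example.

```python
import numpy as np

def hz_direction(g_new, d, y):
    """Hager-Zhang (CG_DESCENT) direction d_{k+1} = -g_{k+1} + beta_HZ * d_k.
    The bound g^T d <= -(7/8) ||g||^2 holds for any d, y with d^T y != 0,
    independently of the line search."""
    dy = d @ y
    beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
    return -g_new + beta * d

# spot-check the descent bound on random 5-dimensional data
rng = np.random.default_rng(0)
g_new, d, y = rng.normal(size=(3, 5))
d_new = hz_direction(g_new, d, y)
assert g_new @ d_new <= -0.875 * (g_new @ g_new) + 1e-10
```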
#1 William W. Hager (UF: University of Florida) · H-Index: 57
#2 Hongchao Zhang (UF: University of Florida) · H-Index: 25
This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.
#1 Elizabeth D. Dolan (Argonne National Laboratory) · H-Index: 3
#2 Jorge J. Moré (Argonne National Laboratory) · H-Index: 51
We propose performance profiles — distribution functions for a performance metric — as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
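Because the tool is easy to state, here is a compact sketch of a performance profile (array names are illustrative): $\rho_s(\tau)$ is the fraction of problems on which solver $s$ is within a factor $\tau$ of the best solver.

```python
import numpy as np

def performance_profile(T, taus):
    """T[p, s]: cost (e.g. CPU time) of solver s on problem p, np.inf on failure.
    Returns rho[i, s]: fraction of problems where solver s is within a factor
    taus[i] of the best solver (Dolan & More, Math. Program. 2002).
    Assumes every problem is solved by at least one solver, so that the
    row minimum is finite."""
    r = T / T.min(axis=1, keepdims=True)            # performance ratios r_{p,s}
    return np.array([(r <= tau).mean(axis=0) for tau in np.atleast_1d(taus)])
```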
#1 Yu-Hong Dai (CAS: Chinese Academy of Sciences) · H-Index: 37
#2 Li-Zhi Liao (Hong Kong Baptist University) · H-Index: 25
Conjugate gradient methods are a class of important methods for unconstrained optimization, especially when the dimension is large. This paper proposes a new conjugacy condition, which considers an inexact line search scheme but reduces to the old one if the line search is exact. Based on the new conjugacy condition, two nonlinear conjugate gradient methods are constructed. Convergence analysis for the two methods is provided. Our numerical results show that one of the methods is very efficient ...
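The new conjugacy condition proposed in that paper (Dai and Liao, 2001), in the usual notation, is
$$d_{k+1}^{\mathsf{T}} y_k = -t\, g_{k+1}^{\mathsf{T}} s_k, \qquad t \ge 0,$$
which reduces to the classical condition $d_{k+1}^{\mathsf{T}} y_k = 0$ under exact line search, since then $g_{k+1}^{\mathsf{T}} s_k = 0$; the "generalized conjugacy condition" in the headline paper's title refers to a condition of this kind.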
#1 Ernesto G. Birgin (USP: University of São Paulo) · H-Index: 34
#2 José Mario Martínez (State University of Campinas) · H-Index: 59
A family of scaled conjugate gradient algorithms for large-scale unconstrained minimization is defined. The Perry, the Polak–Ribière and the Fletcher–Reeves formulae are compared using a spectral scaling derived from Raydan's spectral gradient optimization method. The best combination of formula, scaling and initial choice of step-length is compared against well known algorithms using a classical set of problems. An additional comparison involving an ill-conditioned estimation problem in Optics ...
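The spectral scaling referred to is, presumably, Raydan's spectral stepsize $\theta_{k+1} = \frac{s_k^{\mathsf{T}} s_k}{s_k^{\mathsf{T}} y_k}$, used to scale the gradient term in the search direction, $d_{k+1} = -\theta_{k+1} g_{k+1} + \beta_k s_k$.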
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. The strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method which converges globally, provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, being similar to those required by the Zoutendijk condition.
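To make the distinction drawn here concrete: with $0 < c_1 < c_2 < 1$, the standard Wolfe conditions on the steplength $\alpha_k$ are
$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\mathsf{T}} d_k, \qquad g(x_k + \alpha_k d_k)^{\mathsf{T}} d_k \ge c_2\, g_k^{\mathsf{T}} d_k,$$
while the strong Wolfe conditions replace the second inequality by $|g(x_k + \alpha_k d_k)^{\mathsf{T}} d_k| \le -c_2\, g_k^{\mathsf{T}} d_k$ (recall $g_k^{\mathsf{T}} d_k < 0$ for a descent direction).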
#1 Jean Charles Gilbert (INRIA: French Institute for Research in Computer Science and Automation) · H-Index: 10
#2 Jorge Nocedal · H-Index: 59
This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribière method. Numerical experiments are presented.
In this paper, we describe an implementation and give performance results for a conjugate gradient algorithm for unconstrained optimization. The algorithm is based upon the Nazareth three-term formula and incorporates Allwright preconditioning matrices and restart tests. The performance results for this combination compare favorably with existing codes.
Cited By (5)
#2 Kabiru Ahmed · H-Index: 5
Last. Abubakar Sani Halilu (Sule Lamido University) · H-Index: 5
In this paper, we propose two conjugate gradient methods for solving large-scale monotone nonlinear equations. The methods are developed by combining the hyperplane projection method by Solodov and Svaiter (Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods. Springer, pp 355–369, 1998) and two modified search directions of the famous Dai and Liao (Appl Math Optim 43(1): 87–101, 2001) method. It is shown that the proposed schemes satisfy the sufficient descent condition....
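For context, the Solodov–Svaiter hyperplane projection step referred to in this and the next entry is usually stated as follows: given a trial point $z_k = x_k + \alpha_k d_k$ with $F(z_k)^{\mathsf{T}}(x_k - z_k) > 0$, monotonicity of $F$ implies that the hyperplane $\{x : F(z_k)^{\mathsf{T}}(x - z_k) = 0\}$ separates $x_k$ from the solution set of $F(x) = 0$, and the next iterate is the projection
$$x_{k+1} = x_k - \frac{F(z_k)^{\mathsf{T}}(x_k - z_k)}{\|F(z_k)\|^2}\, F(z_k).$$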
In this paper, we present a family of Perry conjugate gradient methods for solving large-scale systems of monotone nonlinear equations. The methods are developed by combining modified versions of Perry's (Oper. Res. Tech. Notes 26(6), 1073–1078, 1978) conjugate gradient method with the hyperplane projection technique of Solodov and Svaiter (1998). Global convergence of the methods is established, and preliminary numerical results show that the proposed methods are promising ...
#1 Mohammed Yusuf Waziri (Bayero University Kano) · H-Index: 8
#2 Kabiru Ahmed (Bayero University Kano) · H-Index: 5
Last. Jamilu Sabi'u (Northwest University) · H-Index: 8
In this paper, we propose a Dai–Liao (DL) conjugate gradient method for solving large-scale systems of nonlinear equations. The method incorporates an extended secant equation, developed from the modified secant equations proposed by Zhang et al. (J Optim Theory Appl 102(1):147–157, 1999) and Wei et al. (Appl Math Comput 175(2):1156–1188, 2006), into the DL approach. It is shown that the proposed scheme satisfies the sufficient descent condition. The global convergence of the method is established under ...
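For orientation (this is the commonly quoted form of the Zhang et al. modification, not necessarily the exact variant combined in this paper): the modified secant equation replaces $y_k$ by
$$\tilde{y}_k = y_k + \frac{\theta_k}{s_k^{\mathsf{T}} u_k}\, u_k, \qquad \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{\mathsf{T}} s_k,$$
where $u_k$ is any vector with $s_k^{\mathsf{T}} u_k \ne 0$ (often $u_k = s_k$), so that $B_{k+1} s_k = \tilde{y}_k$ also carries function-value information.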
#1 Dongyi Liu (TJU: Tianjin University) · H-Index: 5
#2 Liping Zhang (TJU: Tianjin University) · H-Index: 2
Last. Genqi Xu (TJU: Tianjin University) · H-Index: 21
A new technique for proving the global convergence of nonlinear conjugate gradient methods, the spectral method, is presented in this paper and applied to a new conjugate gradient algorithm with the sufficient descent property. By analyzing the descent property, several concrete forms of this algorithm are suggested. Under standard Wolfe line searches, the global convergence of the new algorithm is proven for nonconvex functions. Preliminary numerical results for a set of 720 unconstrained ...
#1 Dongyi Liu (TJU: Tianjin University) · H-Index: 5
#2 Genqi Xu (TJU: Tianjin University) · H-Index: 21
A family of new conjugate gradient methods is proposed based on Perry's idea; each member satisfies the descent property or the sufficient descent property for any line search. In addition, based on a scaling technique and a restarting strategy, a family of scaling symmetric Perry conjugate gradient methods with restarting procedures is presented. The memoryless BFGS method and the SCALCG method are special forms of the two families of new methods, respectively. Moreover, several concrete new ...
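The memoryless BFGS connection mentioned here is standard: updating $H = I$ by the BFGS formula and setting $d_{k+1} = -H_{k+1} g_{k+1}$ gives
$$d_{k+1} = -g_{k+1} + \frac{(y_k^{\mathsf{T}} g_{k+1})\, s_k + (s_k^{\mathsf{T}} g_{k+1})\, y_k}{s_k^{\mathsf{T}} y_k} - \left(1 + \frac{y_k^{\mathsf{T}} y_k}{s_k^{\mathsf{T}} y_k}\right) \frac{s_k^{\mathsf{T}} g_{k+1}}{s_k^{\mathsf{T}} y_k}\, s_k,$$
which coincides with a Perry-type direction up to the treatment of the $s_k$ term, hence the family relationship described above.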