A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property

Published May 1, 1999 in SIAM Journal on Optimization
DOI: 10.1137/S1052623497318992
Authors: Yu-Hong Dai, Ya-Xiang Yuan
Abstract
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. The strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, being similar to those required by the Zoutendijk condition.
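For orientation, a minimal sketch (ours, not from the paper) of a nonlinear conjugate gradient iteration using the Dai-Yuan update and a Wolfe line search is given below. SciPy's line_search routine enforces the (strong) Wolfe conditions and therefore also the standard Wolfe conditions assumed in the convergence analysis; the tolerance and restart rule are illustrative choices.

import numpy as np
from scipy.optimize import line_search

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Nonlinear CG with the Dai-Yuan parameter
    #   beta_k = ||g_k||^2 / (d_{k-1}^T (g_k - g_{k-1})).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]   # Wolfe-condition step length
        if alpha is None:                       # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                           # gradient difference
        beta = (g_new @ g_new) / (d @ y)        # Dai-Yuan conjugacy parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage: minimize the Rosenbrock function.
from scipy.optimize import rosen, rosen_der
print(dai_yuan_cg(rosen, rosen_der, np.array([-1.2, 1.0])))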
References (14)
This paper investigates the global convergence properties of the Fletcher-Reeves (FR) method for unconstrained optimization. We give a simple proof that a certain inexact line search condition can ensure the convergence of the FR method. Several examples are constructed to show that, if the search conditions are relaxed, the FR method may produce an ascent search direction, which implies that our result cannot be improved.
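For context (a standard definition, not quoted from this entry), the Fletcher-Reeves parameter referred to above is

\[ \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \]

where g_k denotes the gradient at the k-th iterate.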
89 citations.
Liu Guanghui, Han Jiye, Yin Hongxia (Peking University)
In this paper, we investigate the convergence properties of the Fletcher-Reeves algorithm. Under conditions weaker than those in a paper of M. Al-Baali, we obtain the global convergence of the Fletcher-Reeves algorithm with a low-accuracy inexact line search.
63 citations.
Jean Charles Gilbert (INRIA: French Institute for Research in Computer Science and Automation), Jorge Nocedal
This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribiere method. Numerical experiments are presented.
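For context (standard definitions, not quoted from this entry), the Polak-Ribiere parameter is

\[ \beta_k^{PR} = \frac{g_k^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^2}, \]

and the globally convergent variant usually associated with this analysis truncates it to \beta_k^{+} = \max\{\beta_k^{PR}, 0\}.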
727 citations.
Roger Fletcher
Contents: Preface; Table of Notation; Part 1: Unconstrained Optimization (Introduction; Structure of Methods; Newton-like Methods; Conjugate Direction Methods; Restricted Step Methods; Sums of Squares and Nonlinear Equations); Part 2: Constrained Optimization (Introduction; Linear Programming; The Theory of Constrained Optimization; Quadratic Programming; General Linearly Constrained Optimization; Nonlinear Programming; Other Optimization Problems; Non-Smooth Optimization); References; Subject Index.
7,861 citations.
Roger Fletcher
957 citations.
If an inexact line search satisfying standard conditions is used, it can be shown that the Fletcher-Reeves method has a descent property and is globally convergent in a certain sense.
342 citations.
M. J. D. Powell
We consider the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets. Most of our attention is given to the Polak-Ribiere algorithm, and unfortunately we find examples that show that the calculated gradients can remain bounded away from zero. The examples that have only two variables show also that some variable metric algorithms for unconst...
280 citations.
Some corrections and amplifications are appended to "Convergence conditions for ascent methods," this Review, 11 (1969), pp. 226–235.
323 citations.
G. Zoutendijk
295 citations.
Cited By (712)
This work addresses the strictly convex unconstrained minimization problem via a modified version of the gradient method. The proposal is a line search method whose search direction is based on the gradient method: the new direction mixes the negative gradient with another particular direction that uses second-order information. Unlike Newton-type methods, our algorithm does not need to compute the inverse of the Hessian of the objective function. We...
Zewen Li, Lizhi Mu, Junzhuo Zeng, et al. (Changsha University of Science and Technology; 6 authors)
Successful traveling wave protection depends on accurate detection of fault traveling waves. Traveling wave signals are normally detected by Rogowski-coil-based traveling wave sensors; however, the secondary signal of the sensor is severely distorted. To handle this, a novel detection method for voltage traveling waves based on frequency division inversion is proposed. Firstly, the features of the traveling wave collected by the sensor are studied. Secondly, the var...
Abdulkarim Hassan Ibrahim (King Mongkut's University of Technology Thonburi), Jitsupa Deepho (King Mongkut's University of Technology North Bangkok), Abubakar Adamu (African Institute of Science and Technology), et al. (4 authors)
In this paper, a derivative-free method for solving convex constrained nonlinear equations involving a monotone operator with a Lipschitz condition imposed on the underlying operator is introduced and studied. The proposed method combines the projection technique of Solodov and Svaiter with the three-term Polak-Ribiere-Polyak conjugate gradient method for unconstrained optimization proposed by Min Li [J. Ind. Manag. Optim. 16.1 (2020): 245]. Unde...
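As a rough sketch (our assumption about the standard form of the technique named above, not quoted from the entry), the Solodov-Svaiter projection step usually works as follows: given the current iterate x_k, a trial point z_k = x_k + \alpha_k d_k, and the monotone operator F, the next iterate is

\[ x_{k+1} = P_C\!\left( x_k - \frac{F(z_k)^{T}(x_k - z_k)}{\|F(z_k)\|^2}\, F(z_k) \right), \]

where P_C denotes the Euclidean projection onto the feasible set C.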
Abubakar Sani Halilu (Bayero University Kano; Lovely Professional University), Kabiru Ahmed (Bayero University Kano), et al. (4 authors)
In recent years the PCG method has been widely applied to restoring disturbed signals in compressive sensing. This research aims to develop a scheme that is more effective for restoring disturbed signals than the popular PCG method (Liu and Li, 2015). To realize the desired goal, a new conjugate gradient approach is combined with the projection scheme of Solodov and Svaiter [Kluwer Academic Publishers, pp. 355-369 (1998)] for solving monotone nonlinear equations with convex constrain...
2 citations.
Multivariate binary data are increasingly frequent in practice. Although some adaptations of principal component analysis are used to reduce dimensionality for this kind of data, none of them provide a simultaneous representation of rows and columns (biplot). Recently, a technique named logistic biplot (LB) has been developed to represent the rows and columns of a binary data matrix simultaneously, even though the algorithm used to fit the parameters is too computationally demanding to be useful...
Ahmad Alhawarat (Universiti Malaysia Terengganu), Ghaliah Alhamzi (Islamic University), Zabidin Salleh (Universiti Malaysia Terengganu)
Gonglin Yuan (Guangxi University), et al. (4 authors)
In this paper, we present a three-term conjugate gradient algorithm. Three approaches are used in the designed algorithm: (i) a modified weak Wolfe-Powell line search technique is introduced to obtain ...
Behzad Azmi (Austrian Academy of Sciences), Karl Kunisch (Austrian Academy of Sciences)
Zemian Zhang (Guangdong University of Technology), Xuesong Chen (Guangdong University of Technology)
A conjugate gradient algorithm with a strong Wolfe-Powell line search for a distributed optimal control problem is proposed. The optimal system has been discussed in [1]. The proposed algorithm is employed to solve the problem in an infinite-dimensional function space. With low complexity, it is suitable for large-scale problems. The sufficient descent condition of the conjugate gradient direction and the existence of the iterative step are proved. The algorithm also has a global convergence property and linear co...
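For completeness (standard definitions, not quoted from this entry), the strong Wolfe-Powell conditions on the step length \alpha_k along a descent direction d_k are

\[ f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{T} d_k, \qquad |g(x_k + \alpha_k d_k)^{T} d_k| \le c_2 |g_k^{T} d_k|, \]

with 0 < c_1 < c_2 < 1; the standard Wolfe conditions used in the Dai-Yuan analysis replace the second inequality with g(x_k + \alpha_k d_k)^{T} d_k \ge c_2 g_k^{T} d_k.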
Jiang-hua Yin (Guangxi University for Nationalities), Jinbao Jian (Guangxi University for Nationalities), Xianzhen Jiang (Guangxi University for Nationalities)
In this paper, by improving a line search criterion to yield the steplength and designing a hybrid conjugate parameter to construct a sufficient descent direction, we propose a generalized hybrid conjugate gradient projection method for solving large-scale monotone nonlinear equations with convex constraints. For the proposed method, we prove its global convergence under some mild conditions and study its convergence rate. Numerical comparisons with two existing methods show that our method...