A modified Liu-Storey conjugate gradient method and its global convergence for unconstrained optimization

Published on May 26, 2010 in CCDC (Chinese Control and Decision Conference) · DOI: 10.1109/CCDC.2010.5498291
Fujian Duan1
Estimated H-index: 1
(GUET: Guilin University of Electronic Technology),
Zhongbo Sun1
Estimated H-index: 1
(GUET: Guilin University of Electronic Technology)
Abstract
In this paper, a sufficient descent conjugate gradient method with a new sufficient descent search direction is proposed for solving unconstrained optimization problems. The same modification can be applied to other classical conjugate gradient methods. The theoretical analysis shows that the algorithm is globally convergent under some suitable conditions. Numerical results show that the new modified algorithm is effective on unconstrained optimization problems.
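The abstract does not spell out the modified search direction, but the starting point is the classical Liu-Storey update $\beta_k^{LS} = g_k^{T}(g_k - g_{k-1}) / (-d_{k-1}^{T} g_{k-1})$. A minimal Python sketch of a Liu-Storey-type iteration with an Armijo backtracking line search is given below for orientation; the function names and line-search constants are illustrative assumptions, and the paper's modified direction is not reproduced.

import numpy as np

def liu_storey_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Classical Liu-Storey nonlinear conjugate gradient iteration (sketch only;
    # the modified sufficient descent direction of the paper is not reproduced).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (assumed constants rho = 0.5, c = 1e-4)
        alpha, rho, c = 1.0, 0.5, 1e-4
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Liu-Storey parameter: beta_k = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1})
        beta = g_new.dot(g_new - g) / (-d.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x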
References (6)
#1Jorge Nocedal (NU: Northwestern University)H-Index: 59
#2Stephen J. Wright (UW: University of Wisconsin-Madison)H-Index: 66
Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems. For this new edition the book has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are used widely in prac...
#1Zeng Xin Wei (Xida: Guangxi University)H-Index: 1
#2Guoyin Li (UNSW: University of New South Wales)H-Index: 30
Last. Liqun Qi (PolyU: Hong Kong Polytechnic University)H-Index: 75
We propose two algorithms for nonconvex unconstrained optimization problems that employ the Polak-Ribiere-Polyak conjugate gradient formula and new inexact line search techniques. We show that the new algorithms converge globally if the function to be minimized has Lipschitz continuous gradients. Preliminary numerical results show that the proposed methods with particularly chosen line search conditions are very promising.
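For reference, the Polak-Ribiere-Polyak parameter referred to above is the standard formula (with $g_k = \nabla f(x_k)$)

\beta_k^{PRP} = \frac{g_k^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}}, \qquad d_k = -g_k + \beta_k^{PRP} d_{k-1}, \quad d_0 = -g_0.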
#1Li ZhangH-Index: 3
#2Weijun ZhouH-Index: 5
Last. Dong-Hui LiH-Index: 17
In this paper, we propose a modified Polak-Ribiere-Polyak (PRP) conjugate gradient method. An attractive property of the proposed method is that the direction generated by the method is always a descent direction for the objective function. This property is independent of the line search used. Moreover, if exact line search is used, the method reduces to the ordinary PRP method. Under appropriate conditions, we show that the modified PRP method with Armijo-type line search is globally convergen...
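The excerpt does not reproduce the modified direction itself; the three-term form usually associated with this modification (stated here as background, not quoted from the paper) is

d_k = -g_k + \beta_k^{PRP} d_{k-1} - \theta_k y_{k-1}, \qquad \theta_k = \frac{g_k^{T} d_{k-1}}{\|g_{k-1}\|^{2}}, \quad y_{k-1} = g_k - g_{k-1},

which yields $g_k^{T} d_k = -\|g_k\|^{2}$ for every k, independently of the line search, because the last two terms cancel after taking the inner product with $g_k$.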
#1Li Zhang (CSUST: Changsha University of Science and Technology)H-Index: 3
#2Weijun Zhou (CSUST: Changsha University of Science and Technology)H-Index: 5
Last. Dong-Hui Li (Hunan University)H-Index: 17
In this paper, we are concerned with the conjugate gradient methods for solving unconstrained optimization problems. It is well known that the direction generated by a conjugate gradient method may not be a descent direction of the objective function. In this paper, we make a small modification to the Fletcher-Reeves (FR) method such that the direction generated by the modified method provides a descent direction for the objective function. This property depends neither on the line search used,...
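For context, the modified Fletcher-Reeves direction studied in this line of work is usually written (again as background rather than a quotation from the paper) as

d_k = -\theta_k g_k + \beta_k^{FR} d_{k-1}, \qquad \theta_k = \frac{d_{k-1}^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}}, \quad \beta_k^{FR} = \frac{\|g_k\|^{2}}{\|g_{k-1}\|^{2}},

so that $g_k^{T} d_k = (\|g_k\|^{2} / \|g_{k-1}\|^{2})\, g_{k-1}^{T} d_{k-1}$ and, by induction from $d_0 = -g_0$, $g_k^{T} d_k = -\|g_k\|^{2}$ regardless of the line search.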
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition $g_k^{T} d_k \le -\frac{7}{8}\,\|g_k\|^{2}$. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A n...
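This appears to be the Hager-Zhang (CG_DESCENT) scheme; its update parameter is commonly written, with $y_k = g_{k+1} - g_k$, as

\beta_k^{HZ} = \frac{1}{d_k^{T} y_k}\left(y_k - 2 d_k \frac{\|y_k\|^{2}}{d_k^{T} y_k}\right)^{T} g_{k+1},

which reduces to the Hestenes-Stiefel choice $\beta_k^{HS} = g_{k+1}^{T} y_k / (d_k^{T} y_k)$ under exact line search, since then $g_{k+1}^{T} d_k = 0$.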
#1Jorge J. Moré (Argonne National Laboratory)H-Index: 51
#2Burton S. Garbow (Argonne National Laboratory)H-Index: 10
Last. Kenneth E. Hillstrom (Argonne National Laboratory)H-Index: 6
Much of the testing of optimization software is inadequate because the number of test functions is small or the starting points are close to the solution. In addition, there has been too much emphasis on measuring the efficiency of the software and not enough on testing reliability and robustness. To address this need, we have produced a relatively large but easy-to-use collection of test functions and designed guidelines for testing the reliability and robustness of unconstrained optimization softwar...
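As an illustration of how such a test problem is posed, the Rosenbrock function with its standard starting point (-1.2, 1), one of the classic problems in this collection, can be coded as follows (a sketch; the collection provides many more functions and prescribed starting points):

import numpy as np

def rosenbrock(x):
    # Rosenbrock "banana" function; minimizer at (1, 1) with value 0.
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    # Analytic gradient of the Rosenbrock function.
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])  # standard starting point, deliberately far from the solution

A pair like this can be passed directly to a gradient-based solver, for instance the liu_storey_cg sketch shown earlier.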
Cited By (5)
May 25, 2013 in CCDC (Chinese Control and Decision Conference)
#1Zhongbo Sun (Northeast Normal University)H-Index: 2
#2Chunling Xu (Northeast Normal University)H-Index: 1
Last. Haiyin Gao (Changchun University)H-Index: 2
It is well known that the Dai-Yuan conjugate gradient method is globally convergent under the standard Wolfe line search. Recently, Zhang developed two modified Dai-Yuan (MDY) methods that are globally convergent if the standard Armijo line search is used. In this paper, we first investigate the R-convergence rate of the MDY method with an inexact Armijo line search. Second, we show that another modified method (MVDY) converges globally for nonconvex minimization problems. Third, the MVDY method also has an R-convergence rate with an inexact Armijo line search. Numeri...
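For reference, the Dai-Yuan parameter underlying these modifications is

\beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{T}(g_k - g_{k-1})}, \qquad d_k = -g_k + \beta_k^{DY} d_{k-1};

the MDY and MVDY modifications themselves are not given in the excerpt.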
May 25, 2013 in CCDC (Chinese Control and Decision Conference)
#1Ping Ren (Shenyang University)H-Index: 1
#2Liqun Gao (NU: Northeastern University)H-Index: 20
Last. Nan Li (Shenyang University)H-Index: 2
In this paper, the nonlinear optimal control problem is formulated as a multi-objective mathematical optimization problem. The harmony search (HS) algorithm is one of the newer heuristic algorithms. The HS optimization algorithm is introduced for the first time to solve the fault section estimation problem in power systems. A case study of optimal fault section estimation in part of the 230 kV Southern Brazilian electric power system is presented to show the methodology's feasibility and effici...
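Harmony search itself is a simple population-based heuristic. A generic sketch for a box-constrained minimization problem is given below; the parameter names (hms, hmcr, par, bw) follow common descriptions of HS and are not taken from this paper, which applies HS to fault section estimation rather than to a continuous test function.

import numpy as np

def harmony_search(f, lb, ub, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
    # Basic harmony search for minimizing f over the box [lb, ub] (illustrative sketch).
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    hm = rng.uniform(lb, ub, size=(hms, dim))      # harmony memory
    fit = np.array([f(x) for x in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                # take value from memory
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:             # pitch adjustment
                    new[j] += bw * (ub[j] - lb[j]) * rng.uniform(-1.0, 1.0)
            else:                                  # random re-initialisation
                new[j] = rng.uniform(lb[j], ub[j])
        new = np.clip(new, lb, ub)
        f_new = f(new)
        worst = np.argmax(fit)
        if f_new < fit[worst]:                     # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best = np.argmin(fit)
    return hm[best], fit[best]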
May 23, 2012 in CCDC (Chinese Control and Decision Conference)
#1Zhongbo Sun (Northeast Normal University)H-Index: 2
#2Tianxiao Zhu (Changchun University)H-Index: 2
Last. Haiyin Gao (Changchun University)H-Index: 2
In this paper, a modified descent HS conjugate gradient method with a new sufficient descent direction is proposed for solving unconstrained optimization problems. Under some suitable conditions, theoretical analysis shows that the algorithm is globally convergent. Numerical results show that this method is effective for unconstrained optimization problems.
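The Hestenes-Stiefel parameter that this modification starts from is, with $y_{k-1} = g_k - g_{k-1}$,

\beta_k^{HS} = \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}};

the modified sufficient descent direction itself is not reproduced in the excerpt above.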
May 23, 2011 in CCDC (Chinese Control and Decision Conference)
#1Haiyin Gao (Changchun University)H-Index: 2
#2Zhongbo Sun (Northeast Normal University)H-Index: 2
Last. Tianxiao Zhu (Changchun University)H-Index: 2
In this paper, a hybrid conjugate gradient method is proposed for solving unconstrained optimization problems. The parameter β_k is computed as a convex combination of β_k^PRP and β_k^∗. The parameter θ_k is computed so that the direction generated by the conjugate gradient algorithm satisfies the quasi-Newton equation, and the direction is a sufficient descent direction at every iteration. The theoretical analysis shows that the algorithm is globally convergent under some suitable conditions. Nume...
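Written out, the hybrid parameter described above takes the form

\beta_k = (1 - \lambda_k)\, \beta_k^{PRP} + \lambda_k\, \beta_k^{\ast}, \qquad \lambda_k \in [0, 1],

with $\theta_k$ entering the search direction (for example $d_k = -\theta_k g_k + \beta_k d_{k-1}$; the exact form is not given in the excerpt) so that the direction is consistent with the quasi-Newton equation. The specific formulas for $\lambda_k$ and $\theta_k$ are not given in the excerpt.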