A New Modification of Nonlinear Conjugate Gradient Coefficients with Global Convergence Properties

Published on Aug 11, 2015
Ahmad Alhawarat (Estimated H-index: 5), Mustafa Mamat (Estimated H-index: 15), Ismail Mohd (Estimated H-index: 8)
Abstract
The conjugate gradient (CG) method has been widely used to solve large-scale unconstrained optimization problems because of its low iteration count, modest memory requirements, short CPU time, and convergence properties. In this paper we propose a new class of nonlinear conjugate gradient coefficients and prove their global convergence under the exact line search. Numerical results show that the new CG method is efficient when compared with well-known formulas. Keywords: conjugate gradient method, conjugate gradient coefficient, global convergence.
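The abstract does not reproduce the new coefficient itself, so what follows is only a minimal Python sketch of the nonlinear CG template that such coefficients plug into, with the exact line search approximated by a bounded one-dimensional minimization and the classical Polak-Ribiere-Polyak (PRP) rule standing in as a placeholder; the names nonlinear_cg and beta_prp are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar, rosen, rosen_der


def nonlinear_cg(f, grad, x0, beta_rule, tol=1e-6, max_iter=2000):
    """Generic nonlinear CG template: x_{k+1} = x_k + alpha_k * d_k,
    d_{k+1} = -g_{k+1} + beta_k * d_k, with a pluggable beta_rule."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # The "exact" line search is approximated by a bounded 1-D minimization.
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + beta_rule(g_new, g, d) * d
        x, g = x_new, g_new
    return x


def beta_prp(g_new, g, d):
    """Classical Polak-Ribiere-Polyak coefficient, used here only as a
    placeholder; the paper's new coefficient is not reproduced."""
    return g_new @ (g_new - g) / (g @ g)


# Example usage on the Rosenbrock test function; its minimizer is [1., 1.].
x_star = nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0]), beta_prp)
print(x_star)
```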
References (12)
#1 Zhifeng Dai (Hunan University), H-Index: 10
#2 Bo-Shi Tian (CSUST: Changsha University of Science and Technology), H-Index: 1
Recently, similar to Hager and Zhang (SIAM J Optim 16:170–192, 2005), Yu (Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. Thesis of Doctors Degree, Sun Yat-Sen University, 2007) and Yuan (Optim Lett 3:11–21, 2009) proposed modified PRP conjugate gradient methods which generate sufficient descent directions without any line searches. In order to obtain the global convergence of their algorithms, they need the assumption that the stepsize is bounded away fr...
25 Citations
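For orientation, the classical PRP coefficient that the modifications above start from is commonly written (with g_k = ∇f(x_k) and y_{k-1} = g_k − g_{k-1}; standard notation, not quoted from the paper) as

\[
\beta_k^{\mathrm{PRP}} \;=\; \frac{g_k^{\top}\, y_{k-1}}{\lVert g_{k-1}\rVert^{2}},
\qquad y_{k-1} = g_k - g_{k-1}.
\]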
Based on the modified secant equation, we propose two new HS type conjugate gradient formulas. Their forms are similar to the original HS conjugate gradient formula and inherit all nice properties of the HS method. By utilizing the technique of the three-term HS method in Zhang et al. (2007) [15], without the requirement of truncation and convexity of the objective function, we show that one with Wolfe line search and the other with Armijo line search are globally convergent. Moreover,...
23 Citations
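The original Hestenes–Stiefel (HS) formula that these modifications resemble is, in the same standard notation (again an assumption, not a quotation from the source),

\[
\beta_k^{\mathrm{HS}} \;=\; \frac{g_k^{\top}\, y_{k-1}}{d_{k-1}^{\top}\, y_{k-1}}.
\]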
#1 Li Zhang (CSUST: Changsha University of Science and Technology), H-Index: 3
In this paper, we take a little modification to the Wei-Yao-Liu nonlinear conjugate gradient method proposed by Wei et al. [Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Appl. Math. Comput. 183 (2006) 1341-1350] such that the modified method possesses better convergence properties. In fact, we prove that the modified method satisfies the sufficient descent condition with greater parameter σ ∈ (0, 1/2) in the strong Wolfe line search and converg...
36 Citations
#1 Gonglin Yuan (Xida: Guangxi University), H-Index: 8
#2 Xiwen Lu (ECUST: East China University of Science and Technology), H-Index: 3
#3 Zengxin Wei (Xida: Guangxi University), H-Index: 22
A modified conjugate gradient method is presented for solving unconstrained optimization problems, which possesses the following properties: (i) The sufficient descent property is satisfied without any line search; (ii) The search direction will be in a trust region automatically; (iii) The Zoutendijk condition holds for the Wolfe-Powell line search technique; (iv) This method inherits an important property of the well-known Polak-Ribiere-Polyak (PRP) method: the tendency to turn towards the ste...
69 Citations
In this paper, by the use of the projection of the PRP (Polak–Ribiere–Polyak) conjugate gradient direction, we develop a PRP-based descent method for solving unconstrained optimization problems. The method provides a sufficient descent direction for the objective function. Moreover, if exact line search is used, the method reduces to the standard PRP method. Under suitable conditions, we show that the method with some backtracking line search or the generalized Wolfe-type line search is globally con...
73 Citations
#1 Li Zhang (CSUST: Changsha University of Science and Technology), H-Index: 3
#2 Weijun Zhou (Hunan University), H-Index: 5
#3 Dong-Hui Li (Hunan University), H-Index: 17
In this paper, we propose a three-term conjugate gradient method which can produce sufficient descent condition, that is, [image omitted] . This property is independent of any line search used. When an exact line search is used, this method reduces to the standard Hestenes-Stiefel conjugate gradient method. We also introduce two variants of the proposed method which still preserve the sufficient descent property, and prove that these two methods converge globally with standard Wolfe line search ...
91 Citations
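The equation omitted from the abstract above is the sufficient descent identity of the Zhang–Zhou–Li three-term HS direction. As a sketch of the usual construction (our reconstruction, not a quotation from the source),

\[
d_k \;=\; -g_k + \beta_k^{\mathrm{HS}} d_{k-1} - \theta_k y_{k-1},
\qquad
\theta_k \;=\; \frac{g_k^{\top} d_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\]

so the last two terms cancel in the inner product with \(g_k\) and

\[
g_k^{\top} d_k \;=\; -\lVert g_k\rVert^{2}
\]

for any stepsize, which is exactly the line-search-independent property claimed in the abstract.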
#1 Zengxin Wei (Xida: Guangxi University), H-Index: 22
#2 Shengwei Yao (Xida: Guangxi University), H-Index: 2
#3 Liying Liu (Liaocheng University), H-Index: 3
In this paper, a new conjugate gradient formula β_k^* is given to compute the search directions for unconstrained optimization problems. General convergence results for the proposed formula with some line searches such as the exact line search, the Wolfe–Powell line search and the Grippo–Lucidi line search are discussed. Under the above line searches and some assumptions, the global convergence properties of the given methods are discussed. The given formula satisfies β_k^* ⩾ 0, and has the simi...
126 Citations
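The Wei–Yao–Liu coefficient β_k^* referred to above is usually quoted as follows (our transcription; the exact normalization should be checked against the source), and the Cauchy–Schwarz inequality shows it is nonnegative, consistent with the abstract's claim:

\[
\beta_k^{*} \;=\; \frac{\lVert g_k\rVert^{2} - \frac{\lVert g_k\rVert}{\lVert g_{k-1}\rVert}\, g_k^{\top} g_{k-1}}{\lVert g_{k-1}\rVert^{2}} \;\ge\; 0.
\]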
#1 Li Zhang, H-Index: 3
#2 Weijun Zhou, H-Index: 5
#3 Dong-Hui Li, H-Index: 17
In this paper, we propose a modified Polak-Ribiere-Polyak (PRP) conjugate gradient method. An attractive property of the proposed method is that the direction generated by the method is always a descent direction for the objective function. This property is independent of the line search used. Moreover, if exact line search is used, the method reduces to the ordinary PRP method. Under appropriate conditions, we show that the modified PRP method with Armijo-type line search is globally convergen...
300 Citations
#1 Y. Liu (Lboro: Loughborough University), H-Index: 1
#2 C. Storey (Lboro: Loughborough University), H-Index: 7
The effect of inexact line search on conjugacy is studied in unconstrained optimization. A generalized conjugate gradient method based on this effect is proposed and shown to have global convergence for a twice continuously differentiable function with a bounded level set.
307 Citations
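The coefficient behind this generalized method, the Liu–Storey (LS) formula, is commonly written (standard notation, not quoted from the paper) as

\[
\beta_k^{\mathrm{LS}} \;=\; \frac{g_k^{\top}(g_k - g_{k-1})}{-\, d_{k-1}^{\top} g_{k-1}}.
\]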
#1 T. M. Williams, H-Index: 1
677 Citations
Cited By (2)
#1 Zabidin Salleh (UMT: Universiti Malaysia Terengganu), H-Index: 8
#2 Ahmad Alhawarat (UMT: Universiti Malaysia Terengganu), H-Index: 5
The conjugate gradient (CG) method is one of the most popular methods to solve nonlinear unconstrained optimization problems. The Hestenes-Stiefel (HS) CG formula is considered one of the most efficient methods developed in this century. In addition, the HS coefficient is related to the conjugacy condition regardless of the line search method used. However, the HS parameter may not satisfy the global convergence properties of the CG method with the Wolfe-Powell line search if the descent conditi...
7 Citations
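The conjugacy property mentioned above follows directly from the HS definition: for any stepsize α_k (a standard one-line calculation, not a quotation from the source),

\[
d_k^{\top} y_{k-1}
= \bigl(-g_k + \beta_k^{\mathrm{HS}} d_{k-1}\bigr)^{\top} y_{k-1}
= -g_k^{\top} y_{k-1} + \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}\; d_{k-1}^{\top} y_{k-1}
= 0.
\]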
Conjugate gradient (CG) method is an interesting tool to solve optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we depict a new hybrid of CG method which relates to the famous Polak-Ribiere-Polyak (PRP) formula. It reveals a solution for the PRP case which is not globally convergent with the strong Wolfe-Powell (SWP) line search. The new formula possesses the sufficient descent condition and the global convergent properties. In addition, w...
8 Citations