New modifications of conjugate gradient coefficient with global convergence properties

Published on Jun 24, 2012
DOI: 10.1109/SHUSER.2012.6268897
Mohd Rivaie (UiTM: Universiti Teknologi MARA), Muhammad Fauzi (UiTM: Universiti Teknologi MARA), Mustafa Mamat (UMT: Universiti Malaysia Terengganu)
Conjugate gradient (CG) methods have played an important role in solving unconstrained optimization problems due to their simplicity and global convergence properties. In this paper, two new modifications of the conjugate gradient coefficient (β_k) with global convergence properties are presented. The global convergence result is established using exact line searches. Comparisons are made with six other well-known CG coefficients. Preliminary results obtained via performance profiles show that the proposed formulas are competitive with the other CG coefficients.
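For readers unfamiliar with the CG framework the abstract refers to, the sketch below shows a generic CG iteration with an exact line search, specialized to a convex quadratic where the exact step length has a closed form. The classical Fletcher-Reeves coefficient stands in for the paper's proposed β_k, which the abstract does not reproduce.

```python
import numpy as np

# Sketch of a nonlinear CG iteration on the convex quadratic
# f(x) = 0.5 x^T A x - b^T x, where the exact line search is closed-form.
# The Fletcher-Reeves beta_k is a stand-in for the paper's proposed formula.

def cg_fletcher_reeves(A, b, x0, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = A @ x - b              # gradient of the quadratic
    d = -g                     # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)     # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta_k
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])     # symmetric positive definite
b = np.array([1.0, 2.0])
x_star = cg_fletcher_reeves(A, b, np.zeros(2))  # minimizer solves A x = b
```

Different CG methods differ only in the `beta` line; swapping in another coefficient (Polak-Ribiere-Polyak, Hestenes-Stiefel, or the paper's proposals) leaves the rest of the iteration unchanged.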
Neculai Andrei
The paper presents some open problems associated with nonlinear conjugate gradient algorithms for unconstrained optimization. Mainly, these problems refer to the initial direction, the conjugacy condition, the step length computation, new formulas for conjugate gradient parameter computation based on function values, the influence of the accuracy of the line search procedure, how the problem's structure can be exploited in conjugate gradient algorithms, and how second order information can be considered in ...
45 Citations
Gonglin Yuan (Xida: Guangxi University), Sha Lu, Zengxin Wei
It is well known that line search methods play a very important role in optimization. In this paper a new line search method is proposed for solving unconstrained optimization problems. Under weak conditions, this method possesses global convergence and R-linear convergence for nonconvex and convex functions, respectively. Moreover, the given search direction has the sufficient descent property and belongs to a trust region without carrying out any line search rule. Numerical results ...
15 Citations
In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter β_k is computed by an approximation of the Hessian/vector product through finite differences. For search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme able to improve the efficiency of the al...
30 Citations
Gonglin Yuan (Xida: Guangxi University), Zengxin Wei (Xida: Guangxi University)
It is well known that the search direction plays a main role in line search methods. In this paper, we propose a new search direction together with the Wolfe line search technique and one nonmonotone line search technique for solving unconstrained optimization problems. The given methods possess the sufficient descent property without carrying out any line search rule. The convergence results are established under suitable conditions. For numerical results, analysis of one probability s...
52 Citations
Zhen-Jun Shi (UM: University of Michigan), Jinhua Guo (UM: University of Michigan)
In this paper we develop a new class of conjugate gradient methods for unconstrained optimization problems. A new nonmonotone line search technique is proposed to guarantee the global convergence of these conjugate gradient methods under some mild conditions. In particular, Polak-Ribiere-Polyak and Liu-Storey conjugate gradient methods are special cases of the new class of conjugate gradient methods. By estimating the local Lipschitz constant of the derivative of objective functions, we can find...
30 Citations
Neculai Andrei
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995), others are from More, Garbow and...
304 Citations
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition g_k^T d_k ≤ -(7/8) ||g_k||^2. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A n...
686 Citations
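The descent condition quoted in this entry can be spot-checked numerically. The sketch below uses the Hager-Zhang coefficient as it is commonly stated (treating that exact form as an assumption, since the abstract does not spell it out) and verifies g_{k+1}^T d_{k+1} ≤ -(7/8) ||g_{k+1}||^2 on random vectors:

```python
import numpy as np

# Spot-check (a sketch, not the paper's code) of the descent condition
# g^T d <= -(7/8) ||g||^2. The beta formula below is the Hager-Zhang
# coefficient as commonly stated; the bound is claimed to hold for
# arbitrary vectors as long as d_k^T y_k != 0.

rng = np.random.default_rng(0)
checked = 0
for _ in range(1000):
    g_new = rng.normal(size=5)   # gradient at the new iterate, g_{k+1}
    d = rng.normal(size=5)       # previous search direction, d_k
    y = rng.normal(size=5)       # gradient difference, y_k = g_{k+1} - g_k
    dty = d @ y
    if abs(dty) < 1e-8:          # the bound requires d_k^T y_k != 0
        continue
    beta = (y - 2.0 * d * (y @ y) / dty) @ g_new / dty
    d_new = -g_new + beta * d    # new search direction
    assert g_new @ d_new <= -0.875 * (g_new @ g_new) + 1e-9
    checked += 1
```

Notably, the check passes for random data rather than only along an optimization trajectory, which matches the entry's claim that the condition holds for any (inexact) line search.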
Yu-hong Dai, ..., Ya-xiang (view all 4 authors)
The conjugate gradient method for unconstrained optimization problems varies with a scalar. In this note, a general condition concerning the scalar is given, which ensures the global convergence of the method in the case of strong Wolfe line searches. It is also discussed how to use the result to obtain the convergence of the famous Fletcher-Reeves, and Polak-Ribiere-Polyak conjugate gradient methods. That the condition cannot be relaxed in some sense is mentioned.
17 Citations
Elizabeth D. Dolan (Argonne National Laboratory), Jorge J. Moré (Argonne National Laboratory)
We propose performance profiles — distribution functions for a performance metric — as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
2,657 Citations
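Since the paper's own "preliminary results by performance profile" use the Dolan-Moré tool described in this entry, a minimal sketch may help: ρ_s(τ) is the fraction of problems that solver s solves within a factor τ of the best solver's cost. The timing matrix here is hypothetical illustration, not data from any of the papers.

```python
import numpy as np

# Sketch of a Dolan-More performance profile: rho_s(tau) is the fraction
# of problems solver s solves within a factor tau of the best solver.
# The timing matrix below is hypothetical, not data from the paper.

def performance_profile(T, taus):
    """T[p, s] = cost of solver s on problem p (np.inf marks a failure)."""
    best = T.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = T / best                     # performance ratios r_{p,s} >= 1
    return np.array([[np.mean(ratios[:, s] <= tau) for s in range(T.shape[1])]
                     for tau in taus])

T = np.array([[1.0, 2.0],      # hypothetical CPU times: 3 problems, 2 solvers
              [3.0, 1.5],
              [2.0, np.inf]])  # solver 2 never solved problem 3
profile = performance_profile(T, taus=[1.0, 2.0, 4.0])
# e.g. at tau = 2, solver 1 solves all 3 problems within twice the best time
```

Plotting ρ_s against τ gives the familiar profile curves: the value at τ = 1 is the fraction of problems on which a solver is fastest, and the limiting value as τ grows is the fraction it solves at all.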
Jie Sun (NUS: National University of Singapore), Jiapu Zhang (University of Melbourne)
Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results include the following cases: (1) The Fletcher–Reeves method, the Hestenes–Stiefel method, and the Dai–Yuan method applied to a strongly convex LC1 objective function; (2) The Polak–Ribiere method and the Conjugate Descent method applied to a general, not necessarily convex, LC1 objective function.
99 Citations
Cited By 1
Mohamed Hamoda, Mustafa Mamat, ..., Zabidin Salleh (view all 4 authors)
In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems, which possesses the sufficient descent property with the strong Wolfe-Powell (SWP) line search. A global convergence result was proved when the SWP line search was used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test problems showed that this new conjugate gradient algorithm converges more stably and is superior to ...
8 Citations