A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches

Published on Oct 1, 2015 in Applied Mathematics and Computation (Impact Factor: 4.091)
· DOI :10.1016/J.AMC.2015.07.019
Mohd Rivaie (UiTM: Universiti Teknologi MARA), Estimated H-index: 8
Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), Estimated H-index: 16
Abdelrhaman Abashar (Red Sea University), Estimated H-index: 4
Abstract
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients (β_k) that possess the sufficient descent condition and global convergence properties. This new β_k is an extension of the already proven β_k^RMIL from Rivaie et al. [19] (A new class of nonlinear conjugate gradient coefficients with global convergence properties, Appl. Math. Comput. 218 (2012) 11323-11332). Global convergence results are established using both exact and inexact line searches. Numerical results show that the performance of the newly proposed formula is quite similar to that of β_k^RMIL and suited to both line searches. Importantly, this β_k is more efficient than, and superior to, the other well-known β_k formulas.
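Since the new family extends β_k^RMIL, a minimal sketch of the underlying iteration may help fix ideas. The Python below pairs the published RMIL formula with a plain backtracking Armijo rule as the inexact line search; the step-size constants, the nonnegativity restart, and the descent safeguard are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cg_rmil(f, grad, x0, tol=1e-6, max_iter=1000):
    """Conjugate gradient iteration using the RMIL coefficient with a
    backtracking Armijo line search (an inexact line search)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0:                       # safeguard: enforce a descent direction
            d, slope = -g, -(g @ g)
        # Armijo backtracking: shrink a until sufficient decrease holds.
        a, c1, rho, fx = 1.0, 1e-4, 0.5, f(x)
        while f(x + a * d) > fx + c1 * a * slope:
            a *= rho
        x_new = x + a * d
        g_new = grad(x_new)
        # RMIL: beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2
        beta = g_new @ (g_new - g) / (d @ d)
        beta = max(beta, 0.0)                # nonnegativity restart (assumption)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_rmil(f, grad, [-1.2, 1.0]))         # should approach [1, 1]
```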
References (33)
#1 Yu-Hong Dai, H-Index: 37
#2 CaiXia Kou (CAS: Chinese Academy of Sciences), H-Index: 3
In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the original Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild conditions. To accelerate the algorithm, we introduce adaptive restarts along negative gradients ba...
Source
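For context on the line search discussed in this entry, the standard (weak) Wolfe conditions that Dai and Kou's improved variant adjusts are reproduced below in the usual notation; the specific modification that avoids the numerical drawback is detailed in their paper.

```latex
% Standard (weak) Wolfe conditions, with 0 < c_1 < c_2 < 1:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2 \nabla f(x_k)^{\top} d_k.
```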
#1 Mohd Rivaie (UiTM: Universiti Teknologi MARA), H-Index: 8
#2 Mustafa Mamat (UMT: Universiti Malaysia Terengganu), H-Index: 16
Last. Ismail Mohd (UMT: Universiti Malaysia Terengganu), H-Index: 8
view all 4 authors...
Abstract Nonlinear conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements and global convergence properties. Numerous studies and modifications have been conducted recently to improve this method. In this paper, a new class of conjugate gradient coefficients (β_k) that possess global convergence properties is presented. The global convergence result is establi...
Source
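For reference, the coefficient introduced in this 2012 paper, and extended by the paper under review, is, in standard notation,

```latex
\beta_k^{\mathrm{RMIL}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|d_{k-1}\|^{2}},
\qquad g_k = \nabla f(x_k),
```

where d_{k-1} is the previous search direction.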
#1 Yu-Hong Dai (CAS: Chinese Academy of Sciences), H-Index: 37
Conjugate gradient methods are a class of important methods for solving linear equations and for solving nonlinear optimization. In this article, a review on conjugate gradient methods for unconstrained optimization is given. They are divided into early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Two general convergence theorems are provided for the conjugate gradient method assuming the descent property of each search direct...
Source
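The taxonomy in this review rests on two standard properties of a search direction d_k; in common notation, a method is descent or sufficient descent according to

```latex
g_k^{\top} d_k < 0 \quad \text{(descent)}, \qquad
g_k^{\top} d_k \le -c\,\|g_k\|^{2},\ c > 0 \quad \text{(sufficient descent)}.
```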
Abstract Based on the modified secant equation, we propose two new HS type conjugate gradient formulas. Their forms are similar to the original HS conjugate gradient formula and inherit all nice properties of the HS method. By utilizing the technique of the three-term HS method in Zhang et al. (2007) [15] , without the requirement of truncation and convexity of the objective function, we show that one with Wolfe line search and the other with Armijo line search are globally convergent. Moreover,...
Source
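The original Hestenes-Stiefel (HS) formula that these modified formulas resemble is, with y_{k-1} = g_k - g_{k-1},

```latex
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}.
```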
#2 Neculai Andrei, H-Index: 21
The paper presents some open problems associated with the nonlinear conjugate gradient algorithms for unconstrained optimization. Mainly, these problems refer to the initial direction, the conjugacy condition, the step length computation, new formulas for conjugate gradient parameter computation based on function values, the influence of the accuracy of the line search procedure, how the problem's structure can be exploited in conjugate gradient algorithms, and how we can consider the second order information in ...
#1 Zhen-Jun Shi (Central State University), H-Index: 16
#2 Shengquan Wang (UM: University of Michigan), H-Index: 20
Last. Zhiwei Xu (UM: University of Michigan), H-Index: 11
view all 3 authors...
Abstract The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method, which has good numerical performance but no global convergence under traditional line searches such as Armijo line search, Wolfe line search, and Goldstein line search. In this paper we propose a new nonmonotone line search for Liu-Storey conjugate gradient method (LS in short). The new nonmonotone line search can guarantee ...
Source
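The Liu-Storey coefficient referred to here is shown below, together with a typical nonmonotone Armijo test of the kind this line of work builds on, which compares against the largest of the last m(k)+1 function values rather than the current one. The paper's specific nonmonotone rule may differ in detail.

```latex
\beta_k^{\mathrm{LS}} = \frac{g_k^{\top}(g_k - g_{k-1})}{-\,d_{k-1}^{\top} g_{k-1}},
\qquad
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + c_1 \alpha_k g_k^{\top} d_k.
```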
#1 Gonglin Yuan (Xida: Guangxi University), H-Index: 3
#2 Sha Lu, H-Index: 3
Last. Zengxin Wei, H-Index: 1
view all 3 authors...
It is well known that line search methods play a very important role in optimization. In this paper a new line search method is proposed for solving unconstrained optimization. Under weak conditions, this method possesses global convergence and R-linear convergence for nonconvex and convex functions, respectively. Moreover, the given search direction has the sufficient descent property and belongs to a trust region without carrying out any line search rule. Numerical results ...
Source
#1 Gonglin Yuan (Xida: Guangxi University), H-Index: 8
#2 Xiwen Lu (ECUST: East China University of Science and Technology), H-Index: 3
Last. Zengxin Wei (Xida: Guangxi University), H-Index: 21
view all 3 authors...
A modified conjugate gradient method is presented for solving unconstrained optimization problems, which possesses the following properties: (i) The sufficient descent property is satisfied without any line search; (ii) The search direction will be in a trust region automatically; (iii) The Zoutendijk condition holds for the Wolfe-Powell line search technique; (iv) This method inherits an important property of the well-known Polak-Ribiere-Polyak (PRP) method: the tendency to turn towards the ste...
Source
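The PRP property referred to in item (iv) stems from the Polak-Ribiere-Polyak coefficient,

```latex
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}},
```

whose numerator shrinks when consecutive gradients are similar, nudging the direction back toward steepest descent after small steps.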
In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter β_k is computed by an approximation of the Hessian/vector product through finite differences. For search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme able to improve the efficiency of the al...
Source
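The forward-difference Hessian/vector product mentioned here can be sketched in a few lines; the step-size heuristic below is a common choice, not necessarily the careful interval selection this paper proposes.

```python
import numpy as np

def hess_vec_fd(grad, x, v):
    """Forward-difference approximation of the Hessian/vector product:
    H(x) @ v ~= (grad(x + eps*v) - grad(x)) / eps."""
    x, v = np.asarray(x, float), np.asarray(v, float)
    # Common heuristic: scale the interval by machine precision and the
    # iterate/direction sizes so that x + eps*v perturbs x meaningfully.
    eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(x)) \
          / max(np.linalg.norm(v), 1e-12)
    return (grad(x + eps * v) - grad(x)) / eps

# Sanity check on a quadratic f(x) = 0.5 x'Ax, where the product is exactly A @ v.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
print(hess_vec_fd(grad, np.array([1.0, -1.0]), np.array([0.5, 2.0])))  # ~ A @ v
```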
#1 Gonglin Yuan (Xida: Guangxi University), H-Index: 1
#2 Zengxin Wei (Xida: Guangxi University), H-Index: 1
Abstract It is well known that the search direction plays a main role in line search methods. In this paper, we propose a new search direction together with the Wolfe line search technique and one nonmonotone line search technique for solving unconstrained optimization problems. The given methods possess the sufficient descent property without carrying out any line search rule. The convergence results are established under suitable conditions. For numerical results, analysis of one probability s...
Source
Cited By (20)
#1 Ting Zhao (Xidian University), H-Index: 1
#2 Hongwei Liu (Xidian University), H-Index: 28
Last. Zexian Liu (CAS: Chinese Academy of Sciences), H-Index: 1
view all 3 authors...
In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, where a special scaled norm in p-regularization model is analyzed. Different choices of special scaled norm lead to different solutions to the p-regularized subproblem. Based on the analyses of the solutions in a two-dimensional subspace, we derive new directions satisfying the sufficient descent condition. With a modified nonmonotone line search, we establish the global converg...
Source
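In subspace minimization CG methods of this kind, the search direction is typically obtained by minimizing a model m_k over a two-dimensional subspace spanned by the current gradient and the previous step, i.e.

```latex
d_k = \mu_k g_k + \nu_k s_{k-1}, \qquad
(\mu_k, \nu_k) = \operatorname*{arg\,min}_{(\mu,\nu)}\, m_k(\mu g_k + \nu s_{k-1}),
```

where s_{k-1} = x_k - x_{k-1} and m_k is, in this work, a p-regularization model rather than the usual quadratic.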
#2 Xuetao Xie (Sichuan University)
#3 Tao Gao (Beihang University), H-Index: 3
Abstract The Elman recurrent network is a representative model with a feedback mechanism. Although the gradient descent method has been widely used to train Elman networks, it frequently leads to slow convergence. According to optimization theory, the conjugate gradient method is an alternative strategy for searching the descent direction during training. In this paper, an efficient conjugate gradient method is presented to reach the optimal solution in two ways: (1) constructing a more effective conjugat...
Source
#1 Jaafar Hammoud, H-Index: 1
#2 Ali Eisa, H-Index: 1
Last. Natalia Gusarova, H-Index: 4
view all 4 authors...
Gradient methods have applications in multiple fields, including signal processing, image processing, and dynamic systems. In this paper, we present a nonlinear gradient method for solving convex supra-quadratic functions by developing the search direction, which is obtained by hybridizing the two conjugate coefficients HRM [2] and NHS [1]. The numerical results demonstrated the effectiveness of the presented method by applying it to solve standard problems, reaching the exact solution if the objec...
#2 Mustafa Mamat, H-Index: 16
Last. Maulana Malik, H-Index: 4
view all 5 authors...
Source
#1 Xinliu Diao (Xidian University)
#2 Hongwei Liu (Xidian University), H-Index: 2
Last. Zexian Liu (CAS: Chinese Academy of Sciences)
view all 3 authors...
In this paper, a new subspace minimization conjugate gradient method based on modified secant equation is proposed and analyzed. For a classical subspace minimization conjugate gradient method, the search direction is derived by minimizing an approximate quadratic model of objective function in a two-dimensional subspace. Generally, the approximate Hessian matrix in the above quadratic model is required to satisfy the standard secant equation, while we consider an approximate Hessian matrix whic...
Source
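The standard secant equation mentioned in this entry requires the approximate Hessian B_k to match the observed gradient change,

```latex
B_k s_{k-1} = y_{k-1}, \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1};
```

modified secant equations replace y_{k-1} by a corrected vector that also incorporates function values, which is the route taken here.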
#1 Siti Farhana Husin, H-Index: 1
Last. Mohd Rivaie, H-Index: 8
view all 4 authors...
This study employs exact line search iterative algorithms for solving large-scale unconstrained optimization problems, in which the direction is a three-term modification of an iterative method with two different scaled parameters. The objective of this research is to assess the effectiveness of the new directions both theoretically and numerically. The sufficient descent property and global convergence analysis of the suggested methods are established. For numerical experiment purposes, the methods a...
Source
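A generic three-term direction of the kind described takes the form below, where the third term is often built from the gradient difference y_{k-1}; the two scaled parameters studied in this work enter through the coefficients, and the exact formulas are in the paper.

```latex
d_k = -g_k + \beta_k d_{k-1} + \theta_k y_{k-1}.
```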
#1 Maulana Malik, H-Index: 4
#2 Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), H-Index: 16
Last. Sukono (UNPAD: Padjadjaran University), H-Index: 7
view all 4 authors...
#2 Mustafa Mamat, H-Index: 16
#3 Mohd Rivaie, H-Index: 8
view all 4 authors...
Source
#1 Ting Wang (Xidian University), H-Index: 1
#2 Zexian Liu (Xidian University), H-Index: 1
Last. Hongwei Liu (Xidian University), H-Index: 16
view all 3 authors...
In this paper, we present a new conjugate gradient method, in which the search direction is computed by minimizing a selected approximate model in a two-dimensional subspace. That is, if the objective function is not close to a quadratic, the search direction is generated by a conic model. Otherwise, a quadratic model is considered. The direction of the proposed method is proved to possess the sufficient descent property. With the modified nonmonotone line search, we establish a global convergen...
Source