Convergence analysis of a new coefficient conjugate gradient method under exact line search

Maulana Malik (H-Index: 4), Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin, H-Index: 16), Sukono (UNPAD: Padjadjaran University, H-Index: 7), and 1 further author
Abstract
References (20)
Abstract: Based on the global convergence of the RMIL conjugate gradient method, Dai (2016) modified it and called the modified version RMIL+, which has good numerical results and is globally convergent under the exact line search. In this paper, we establish the sufficient descent property and the global convergence of RMIL+ via the strong Wolfe line search method. Moreover, numerical results based on well-known optimization problems show that the modified method is competitive when compared wi...
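Since several of the entries below share the same algorithmic skeleton, a minimal sketch of a nonlinear CG iteration under a strong Wolfe line search may help fix notation. It uses an RMIL-type coefficient β_k = g_k^T(g_k - g_{k-1}) / ||d_{k-1}||^2 as a stand-in; the exact RMIL+ modification of Dai (2016) is not reproduced here, the nonnegativity safeguard is only an assumption, and the test function is illustrative. SciPy's line_search implements the strong Wolfe conditions.

# Sketch of a nonlinear conjugate gradient loop with a strong Wolfe line search.
# The coefficient below is an RMIL-type formula, not the exact RMIL+ rule.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def cg_rmil_type(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                       # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (d @ d)    # RMIL-type coefficient
        beta = max(beta, 0.0)                   # safeguard (assumption, not RMIL+)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

print(cg_rmil_type(rosen, rosen_der, [-1.2, 1.0]))  # iterates should approach the minimizer [1, 1]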
#1 Gonglin Yuan (Xida: Guangxi University), H-Index: 3
#2 Tingting Li (Xida: Guangxi University), H-Index: 2
Last. Wujie Hu (Xida: Guangxi University), H-Index: 4
(3 authors in total)
Abstract: Nonlinear systems present a quite complicated problem. As the number of dimensions increases, it becomes more difficult to find the solution of the problem. In this paper, a modified conjugate gradient method is designed that has a sufficient descent property and a trust region property. It is interesting that the formula for the search direction makes full use of the convex combination between the steepest descent algorithm and the classical LS conjugate gradient (CG) method. The g...
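The convex-combination idea mentioned above can be written as a generic template; the weight θ_k below is hypothetical and the display is not the specific formula of Yuan et al., only an illustration combining the steepest descent direction with the classical Liu-Storey (LS) direction:

\[ d_k = \theta_k(-g_k) + (1-\theta_k)\bigl(-g_k + \beta_k^{LS} d_{k-1}\bigr), \qquad \beta_k^{LS} = \frac{g_k^{\top}(g_k - g_{k-1})}{-d_{k-1}^{\top} g_{k-1}}, \qquad \theta_k \in [0,1]. \]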
#1 Mohammed Yusuf Waziri (Bayero University Kano), H-Index: 8
#2 Kabiru Ahmed (Bayero University Kano), H-Index: 5
Last. Jamilu Sabi'u (Northwest University (United States)), H-Index: 8
(3 authors in total)
Abstract: This paper presents two modified Hager–Zhang (HZ) conjugate gradient methods for solving large-scale systems of monotone nonlinear equations. The methods were developed by combining modified forms of the one-parameter method by Hager and Zhang (2006) and the hyperplane projection technique. Global convergence and numerical results of the methods are established. Preliminary numerical results show that the proposed methods are promising and more efficient compared to the methods presented...
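For readers unfamiliar with the hyperplane projection technique mentioned in this entry, the step below sketches the classical Solodov-Svaiter projection used for monotone equations F(x) = 0; the HZ-type search direction itself is omitted and the function names are illustrative only.

import numpy as np

def projection_step(F, x, d, alpha):
    # One hyperplane projection update (a sketch, not the authors' exact method).
    # z = x + alpha*d is the trial point; monotonicity of F puts the solution set
    # on one side of the hyperplane {y : F(z)^T (y - z) = 0}, so project x onto it.
    z = x + alpha * d
    Fz = F(z)
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz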
#1 Srimazzura Basri (UniSZA: Universiti Sultan Zainal Abidin), H-Index: 2
Abstract: Nonlinear conjugate gradient methods are widely used in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements. Numerous studies have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients that possess global convergence properties is proposed. The global convergence result using exact line searches is discussed. Numerical results show that the proposed method is more eff...
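Here and in the title of the main paper, "exact line search" means the step size is the exact one-dimensional minimizer along d_k, which gives the orthogonality relation used throughout the convergence analysis:

\[ \alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k) \quad\Longrightarrow\quad g_{k+1}^{\top} d_k = 0. \]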
#1 Mohd Rivaie (UiTM: Universiti Teknologi MARA), H-Index: 8
#2 Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), H-Index: 16
Last. Abdelrhaman Abashar (Red Sea University), H-Index: 4
(3 authors in total)
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization. In this paper, we propose a new family of CG coefficients (β_k) that possess sufficient descent conditions and global convergence properties. This new β_k is an extension of the already proven β_k^RMIL from Rivaie et al. [19] (A new class of nonlinear conjugate gradient coefficient with global convergence properties, Appl. Math. Comp. 218 (2012) 11323-11332). Global convergence result is e...
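For reference, the β_k^RMIL that this family extends is usually written as below (following Rivaie et al., 2012); the extended coefficients themselves are not reproduced here:

\[ \beta_k^{RMIL} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|d_{k-1}\|^{2}}. \]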
#1 Dongyi Liu (TJU: Tianjin University), H-Index: 5
#2 Liping Zhang (TJU: Tianjin University), H-Index: 2
Last. Genqi Xu (TJU: Tianjin University), H-Index: 21
(3 authors in total)
A new method for proving the global convergence of nonlinear conjugate gradient methods, the spectral method, is presented in this paper, and it is applied to a new conjugate gradient algorithm with a sufficient descent property. By analyzing the descent property, several concrete forms of this algorithm are suggested. Under standard Wolfe line searches, the global convergence of the new algorithm is proven for nonconvex functions. Preliminary numerical results for a set of 720 unconstrain...
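The sufficient descent property invoked in this and several other entries is the standard requirement that, for some constant c > 0 independent of k,

\[ g_k^{\top} d_k \le -c\,\|g_k\|^{2} \qquad \text{for all } k. \]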
Following the scaled conjugate gradient methods proposed by Andrei, we hybridize the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martinez, based on a modified secant equation suggested by Yuan, and propose two modified scaled conjugate gradient methods. The interesting features of our methods are that they use function values in addition to gradient values and satisfy the sufficient descent c...
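As a rough template only, scaled (spectral) CG methods replace the unit scaling of the gradient in the search direction; the choice of θ_k shown below is a Birgin-Martinez-style spectral parameter and is an assumption here, not the authors' exact scaling:

\[ d_k = -\theta_k g_k + \beta_k d_{k-1}, \qquad \theta_k = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}, \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1}. \]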
#1 Jinkui Liu (CTGU: Chongqing Three Gorges University), H-Index: 1
Conjugate gradient methods are a class of important methods for unconstrained optimization problems, especially when the dimension is large. In this paper, we study a class of modified conjugate gradient methods based on the famous LS conjugate gradient method, which produces a sufficient descent direction at each iteration and converges globally provided that the line search satisfies the strong Wolfe condition. At the same time, a new specific nonlinear conjugate gradient method is constructed...
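The strong Wolfe conditions referred to in this and other entries are, for constants 0 < c_1 < c_2 < 1,

\[ f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{\top} d_k, \qquad \lvert g(x_k + \alpha_k d_k)^{\top} d_k \rvert \le c_2\,\lvert g_k^{\top} d_k \rvert. \]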
#1 Xiangfei Yang (Hunan University of Humanities, Science and Technology), H-Index: 1
#2 Zhijun Luo (Hunan University of Humanities, Science and Technology), H-Index: 1
Last. Xiaoyu Dai (Hunan University of Humanities, Science and Technology), H-Index: 1
(3 authors in total)
The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method, which is a hybridization of the known LS and CD conjugate gradient algorithms, is presented and analyzed. Under some mild conditions, a Wolfe-type line search can guarantee the global convergence of the LS-CD method. The numerical results show that the algorithm is efficient.
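The two ingredients of the LS-CD hybrid are the Liu-Storey and Conjugate Descent coefficients, recalled below; how the paper combines them is not reproduced here:

\[ \beta_k^{LS} = \frac{g_k^{\top}(g_k - g_{k-1})}{-d_{k-1}^{\top} g_{k-1}}, \qquad \beta_k^{CD} = \frac{\|g_k\|^{2}}{-d_{k-1}^{\top} g_{k-1}}. \]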
#1 Mohd Rivaie (UiTM: Universiti Teknologi MARA), H-Index: 8
#2 Mustafa Mamat (UMT: Universiti Malaysia Terengganu), H-Index: 16
Last. Ismail Mohd (UMT: Universiti Malaysia Terengganu), H-Index: 8
(4 authors in total)
Abstract: Nonlinear conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements and global convergence properties. Numerous studies and modifications have been conducted recently to improve these methods. In this paper, a new class of conjugate gradient coefficients (β_k) that possess global convergence properties is presented. The global convergence result is establi...
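Global convergence results of this kind are usually obtained under a bounded level set and a Lipschitz continuous gradient, via the Zoutendijk condition; combined with sufficient descent and bounded directions (both additional assumptions in this summary), the gradients are forced to zero:

\[ \sum_{k \ge 0} \frac{(g_k^{\top} d_k)^{2}}{\|d_k\|^{2}} < \infty, \qquad g_k^{\top} d_k \le -c\,\|g_k\|^{2}, \qquad \|d_k\| \le M \quad\Longrightarrow\quad \lim_{k\to\infty} \|g_k\| = 0. \]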
Cited By (3)
#1 Basim A. Hassan, H-Index: 4
Last. Abdulkarim Hassan Ibrahim, H-Index: 12
(4 authors in total)
#1 Xiang-Min Liu (Jiangxi University of Science and Technology)
#2 Jian Hu (Jiangxi University of Science and Technology)
Last. Zhi-Gang Chen (CSU: Central South University)
(7 authors in total)