Some Three-Term Conjugate Gradient Algorithms with Descent Condition for Unconstrained Optimization Models

Published on Jan 1, 2020
DOI: 10.5373/JARDCS/V12I2/S20201297
Mustafa Mamat (Estimated H-index: 16)
Abstract
References (5)
#1 J. K. Liu (CTGU: Chongqing Three Gorges University), H-Index: 4
#2 Yuming Feng (CTGU: Chongqing Three Gorges University), H-Index: 11
Last: Limin Zou (CTGU: Chongqing Three Gorges University), H-Index: 10
(3 authors)
Three-term conjugate gradient methods for solving large-scale optimization problems are favored by many researchers because of their good descent and convergence properties. In this paper, we extend some new conjugate gradient methods and construct some three-term conjugate gradient methods. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition without any line search. Under the standard Wolfe line search, the global converg...
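For context, and as a general sketch rather than the specific formulas of this reference, a three-term conjugate gradient method generates iterates x_{k+1} = x_k + \alpha_k d_k with a search direction of the form

    d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad y_k = g_{k+1} - g_k,

where g_k = \nabla f(x_k) and \beta_k, \theta_k are method-dependent scalars. The sufficient descent condition mentioned above requires

    g_k^{T} d_k \le -c \, \|g_k\|^{2} \quad \text{for some } c > 0 \text{ and all } k,

so that d_k is a descent direction regardless of the line search used.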
#2 Mustafa Mamat, H-Index: 16
Last: Zabidin Salleh, H-Index: 11
(5 authors)
Nonlinear conjugate gradient methods hold an important role in solving large-scale unconstrained optimization problems. Their simplicity, low memory requirement, and global convergence have stimulated extensive study of the method. Numerous modifications have been proposed recently to improve its performance. In this paper, we propose a new formula for the conjugate gradient coefficient \beta_k that generates a descent search direction. In addition, we establish the global convergence result under exact l...
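For reference, and without reproducing the specific coefficient proposed in this work, the nonlinear conjugate gradient iteration in which such a coefficient \beta_k appears is

    x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,

where g_k = \nabla f(x_k). Classical choices of \beta_k include the Fletcher-Reeves and Polak-Ribiere formulas; a modified \beta_k is typically judged by whether the resulting d_k remains a descent direction and whether the method converges globally.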
#2 Mustafa Mamat, H-Index: 16
Last: Zabidin Salleh, H-Index: 11
(6 authors)
Conjugate gradient (CG) methods have played a significant role in solving large-scale unconstrained optimization problems. This is due to their simplicity, low memory requirement, and global convergence properties. Various studies and modifications have been carried out recently to improve this method. In this paper, we propose a new conjugate gradient parameter \beta_k which possesses global convergence properties under the exact line search. Numerical results show that our new formula performs better when comp...
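As a reminder of the line-search setting named here (a standard definition, not specific to this reference), the exact line search chooses the step size as

    \alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k),

which implies the orthogonality condition g_{k+1}^{T} d_k = 0; this identity is what typically makes global convergence proofs for new \beta_k formulas tractable under exact line searches.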
#1 Mohd Rivaie, H-Index: 8
#2 Mustafa Mamat, H-Index: 16
Last: Ismail Mohd, H-Index: 8
(4 authors)
Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization due to their low memory requirements and global convergence properties. Numerous studies and modifications have been devoted recently to improving this method. In this paper, a new modification of the conjugate gradient coefficient (\beta_k) with global convergence properties is presented. The global convergence result is established using exact line searches. Preliminary results show that the p...
#1 M. J. D. Powell, H-Index: 56
We consider the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets. Most of our attention is given to the Polak-Ribiere algorithm, and unfortunately we find examples that show that the calculated gradients can remain bounded away from zero. The examples that have only two variables show also that some variable metric algorithms for unconst...
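For context (a standard definition, not quoted from the reference above), the Polak-Ribiere coefficient analysed there is

    \beta_k^{PR} = \frac{g_{k+1}^{T} (g_{k+1} - g_k)}{\|g_k\|^{2}},

used in the update d_{k+1} = -g_{k+1} + \beta_k^{PR} d_k; the examples mentioned in the abstract show that, in the absence of restarts, the gradients generated with this choice can remain bounded away from zero even under exact arithmetic and exact line searches.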
Cited By (1)
#1 Ibrahim Mohammed Sulaiman (UniSZA: Universiti Sultan Zainal Abidin), H-Index: 6
#2 Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin), H-Index: 16
Last: Maulana Malik, H-Index: 4
(6 authors)