The convergence properties of some descent conjugate gradient algorithms for optimization models

Published on Jul 31, 2020
· DOI :10.22436/JMCS.022.03.02
Ibrahim Mohammed Sulaiman (UniSZA: Universiti Sultan Zainal Abidin),
Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin),
+ 3 authors, including Maulana Malik
References (11)
J.K. Liu, Y.X. Zhao, and X.L. Wu (CTGU: Chongqing Three Gorges University)
Abstract: In this paper, a three-term conjugate gradient method with a new direction structure is proposed for solving large-scale unconstrained optimization problems; it generates a sufficient descent direction at each iteration with the aid of some inexact line search conditions. Under suitable assumptions, the proposed method is globally convergent for nonconvex smooth problems. We further generalize the new direction structure to other traditional methods and obtain some algorithms with the...
6 Citations
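The "sufficient descent in every iteration, independent of the line search" property described in the abstract above can be illustrated with one well-known three-term construction (the Zhang–Zhou–Li form, shown here only as an illustrative sketch; the paper's exact direction structure may differ):

```python
import numpy as np

def three_term_direction(g, g_prev, d_prev):
    """Three-term CG direction (Zhang-Zhou-Li form, for illustration):
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1}
    with y_{k-1} = g_k - g_{k-1}. By construction g_k^T d_k = -||g_k||^2,
    i.e. sufficient descent holds regardless of the line search used."""
    y = g - g_prev
    denom = g_prev.dot(g_prev)
    beta = g.dot(y) / denom        # PRP-type parameter
    theta = g.dot(d_prev) / denom  # correction term cancels beta's contribution
    return -g + beta * d_prev - theta * y
```

The algebra behind the descent guarantee: the two extra terms contribute `beta * g.dot(d_prev) - theta * g.dot(y)`, which cancel exactly, leaving `g.dot(d) = -g.dot(g)`.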
Mustafa Mamat
1 Citation
Mustafa Mamat, Zabidin Salleh, et al. (6 authors)
Conjugate gradient (CG) methods have played a significant role in solving large-scale unconstrained optimization problems, owing to their simplicity, low memory requirements, and global convergence properties. Various studies and modifications have been carried out recently to improve these methods. In this paper, we propose a new conjugate gradient parameter β_k which possesses global convergence properties under the exact line search. Numerical results show that our new formula performs better when comp...
7 Citations
A simple three-term conjugate gradient algorithm which satisfies both the descent condition and the conjugacy condition is presented. This algorithm is a modification of the Hestenes and Stiefel algorithm (Hestenes and Stiefel, 1952) [10], or that of Hager and Zhang (Hager and Zhang, 2005) [23] in such a way that the search direction is descent and it satisfies the conjugacy condition. These properties are independent of the line search. Also, the algorithm could be considered as a modification ...
46 Citations
Conjugate gradient methods are widely used for unconstrained optimization, especially large scale problems. The strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally, provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, being similar to those required by the Zoutendijk condition.
736 Citations
This paper investigates the global convergence properties of the Fletcher-Reeves (FR) method for unconstrained optimization. In a simple way, we prove that a kind of inexact line search condition can ensure the convergence of the FR method. Several examples are constructed to show that, if the search conditions are relaxed, the FR method may produce an ascent search direction, which implies that our result cannot be improved.
90 Citations
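The Fletcher-Reeves method discussed in the two references above can be sketched as follows. This is only an illustrative implementation: it pairs the FR parameter with a simple backtracking (Armijo) line search for brevity, whereas the convergence analyses cited rely on (strong) Wolfe conditions, so the guarantees discussed above do not directly apply to this sketch.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the Fletcher-Reeves parameter
        beta_k^FR = ||g_k||^2 / ||g_{k-1}||^2
    and a backtracking Armijo line search (a simplification)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking until the Armijo sufficient-decrease condition holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d               # new conjugate direction
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = x·x from any starting point drives the gradient norm below the tolerance.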
3,485 Citations
If an inexact line search satisfying standard conditions is used, it can be shown that the Fletcher-Reeves method has a descent property and is globally convergent in a certain sense.
352 Citations
Abstract: The conjugate gradient method was first described in [1, 2] for solving sets of linear algebraic equations. The method, being iterative in form, has all the merits of iterative methods, and enables a set of linear equations to be solved (or, what amounts to the same thing, the minimum of a quadratic functional in finite-dimensional space to be found) after a finite number of steps. The method was later extended to the case of Hilbert space [3–5], and to the case of non-quadratic function...
630 Citations
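The classical linear conjugate gradient method for Ax = b with A symmetric positive definite, as described in the abstract above, can be sketched as a textbook implementation (finite termination in at most n steps in exact arithmetic):

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear CG for Ax = b, A symmetric positive definite.
    Each step minimizes the quadratic 0.5 x^T A x - b^T x exactly
    along a direction A-conjugate to all previous ones."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x          # residual (negative gradient of the quadratic)
    p = r.copy()           # first search direction
    rs = r.dot(r)
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / p.dot(Ap)       # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r.dot(r)
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # make next direction A-conjugate
        rs = rs_new
    return x
```

For non-quadratic functions, the extensions mentioned in the abstract replace the exact step length with a line search and the residual with the negative gradient.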
3,589 Citations
Cited By (4)
Ibrahim Mohammed Sulaiman and Alomari Mohammad Ahmed (UniSZA: Universiti Sultan Zainal Abidin), et al. (6 authors)
The hybrid conjugate gradient (CG) method is among the efficient variants of the CG method for solving optimization problems, owing to its low memory requirements and nice convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that the method satisfies the sufficient descent condition. The global convergence proof of the proposed method is established under inexact line search. Application of the proposed m...
Maulana Malik (UI: University of Indonesia), Auwal Bala Abubakar (Sefako Makgatho Health Sciences University), Sukono (UNPAD: Padjadjaran University), et al. (6 authors)
Mustafa Mamat, Maulana Malik, et al. (5 authors)
2 Citations