A new type of descent conjugate gradient method with exact line search

Published on Jun 2, 2016
DOI: 10.1063/1.4952569
Nurul Hajar (H-index: 2), Mustafa Mamat (H-index: 16), + 1 author, Ibrahim Jusoh (H-index: 2)
Abstract
Conjugate gradient (CG) methods are a popular choice for solving nonlinear unconstrained optimization problems. In this paper, a new CG method is proposed and analyzed. The new method satisfies the descent condition, and its global convergence is established under exact line search. Numerical results show that the new method substantially outperforms previous CG methods and that it is robust and efficient, providing faster and more stable convergence.
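As a rough illustration of the framework the abstract describes, here is a minimal Python sketch of a descent CG iteration with an "exact" line search, implemented numerically as a one-dimensional minimization along the search direction. The paper's own coefficient βk is not reproduced on this page, so a classical Fletcher-Reeves coefficient stands in; the function names and the bracketing interval for the line search are illustrative choices.

```python
# Minimal sketch of nonlinear CG with an (numerically) exact line search.
# The paper's beta_k is not given here; Fletcher-Reeves is a stand-in.
import numpy as np
from scipy.optimize import minimize_scalar

def cg_exact_line_search(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # d_0: steepest descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search: alpha_k = argmin_{alpha} f(x_k + alpha d_k)
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0, 10), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves stand-in for beta_k
        d = -g_new + beta * d                # CG direction update
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic (hypothetical test problem):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(cg_exact_line_search(f, grad, np.zeros(2)))  # ~ solution of A x = b
```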
References (13)
#1Osman OmerH-Index: 2
#2Mustafa MamatH-Index: 16
Last. Mohd RivaieH-Index: 8
view all 4 authors...
Conjugate gradient methods are among the most famous methods for solving nonlinear unconstrained optimization problems, especially large-scale problems, owing to their simplicity and low memory requirements. The strong Wolfe line search is usually used in practice for the analysis and implementation of conjugate gradient methods. In this paper, we present a new nonlinear conjugate gradient method with strong Wolfe line search for unconstrained optimization problems. Under some assumptions...
Source
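For context, a minimal sketch of the strong Wolfe conditions that this entry (and several below) use to accept a step size; the constants c1 and c2 are conventional illustrative values, not taken from the paper.

```python
# Strong Wolfe conditions for accepting a step size alpha along direction d:
# sufficient decrease plus a bound on the absolute directional derivative.
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    g0_d = grad(x) @ d                                   # initial slope along d
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
    return sufficient_decrease and curvature
```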
#2Mustafa MamatH-Index: 16
Last. Ismail MohdH-Index: 8
view all 4 authors...
The classical steepest descent (SD) method is known as one of the earliest and best-known methods for minimizing a function. Although its convergence rate is quite slow, its simplicity has made it one of the easiest methods to use and apply, especially in the form of computer codes. In this paper, a new modification of the SD method is proposed using a new search direction (dk) in the form of two parameters. Numerical results show that this new SD method has a far superior convergence rate and more ef...
Source
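As a baseline for the modification described above, a minimal sketch of classical steepest descent with an exact line search; the entry's two-parameter direction is not reproduced on this page, so only the classical dk = -gk direction is shown.

```python
# Classical steepest descent with a numerically exact line search.
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = -grad(x)                        # d_k = -g_k
        if np.linalg.norm(d) < tol:
            break
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0, 10), method="bounded").x
        x = x + alpha * d
    return x
```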
#2Mustafa MamatH-Index: 16
Last. Ismail MohdH-Index: 8
view all 4 authors...
Conjugate gradient (CG) methods represent an important computational innovation for solving large-scale unconstrained optimization problems. There are many different versions of CG methods; although some are equivalent to one another, their performances differ considerably. This paper presents a new CG method based on a modification of the original CG methods. The key criterion for this new CG method is its global convergence properties. Numerical results show that this new CG method p...
Source
#2Mustafa MamatH-Index: 16
Last. Osman OmerH-Index: 2
view all 5 authors...
Conjugate gradient methods are effective in solving linear equations and non-linear optimization problems. In this work, we compare our new conjugate gradient coefficient βk with the classical formulas under the strong Wolfe line search; our method satisfies the sufficient descent condition. Numerical results have shown that the new βk performs better than the classical formulas.
Source
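The classical coefficients that new βk proposals in this literature are typically compared against are well documented; a short sketch follows (the entry's own βk is not reproduced here).

```python
# Classical CG coefficients used as comparison baselines in this literature.
import numpy as np

def beta_fr(g_new, g_old):               # Fletcher-Reeves
    return (g_new @ g_new) / (g_old @ g_old)

def beta_prp(g_new, g_old):              # Polak-Ribiere-Polyak
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_hs(g_new, g_old, d_old):        # Hestenes-Stiefel
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)
```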
#1Norrlaili ShapieeH-Index: 3
#2Mohd RivaieH-Index: 8
Last. Ismail MohdH-Index: 4
view all 4 authors...
Conjugate gradient (CG) methods are important for large-scale unconstrained optimization due to their low memory requirements and global convergence properties. Numerous studies have proposed new CG coefficients and sought to improve their efficiency. In this paper, we propose a new CG coefficient based on the original Hestenes-Stiefel CG coefficient. The global convergence result is established using exact line search. Most of our numerical results show that our method is very efficient wh...
Source
#1Mohd Rivaie (UiTM: Universiti Teknologi MARA)H-Index: 8
#2Mustafa Mamat (UMT: Universiti Malaysia Terengganu)H-Index: 16
Last. Ismail Mohd (UMT: Universiti Malaysia Terengganu)H-Index: 8
view all 4 authors...
Nonlinear conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization. Their wide application in many fields is due to their low memory requirements and global convergence properties. Numerous studies and modifications have been conducted recently to improve this method. In this paper, a new class of conjugate gradient coefficients (βk) that possess global convergence properties is presented. The global convergence result is establi...
Source
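This entry appears to correspond to the RMIL family of coefficients. A sketch under that assumption follows; the formula below is the commonly cited RMIL form and is not quoted from this page.

```python
# Assumed RMIL-type coefficient: beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2.
# This exact form is an assumption; it is not reproduced on this page.
import numpy as np

def beta_rmil(g_new, g_old, d_old):
    return (g_new @ (g_new - g_old)) / (d_old @ d_old)
```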
#1Hideaki Iiduka (Kyushu Institute of Technology)H-Index: 18
#2Yasushi NarushimaH-Index: 9
Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions of the conventional methods are defined using the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which take into account mainly information about the objective function. We prove that they converge globally and numerically compare them with conventional methods. The results show that with slight modificat...
Source
#1Jiawei Zhang (ZJU: Zhejiang University)H-Index: 1
#2Binghe Chen (ZJU: Zhejiang University)H-Index: 1
We propose a new iterative method to solve boundary value problems (BVPs) of the Falkner-Skan equation over a semi-infinite interval. In our approach, we use the free boundary formulation to truncate the semi-infinite interval into a finite one. We then use the shooting method to transform the BVP into initial value problems (IVPs). To find the "shooting angle" and the unknown free boundary, a modification of the classical Newton's method is used, where the Jacobian matrix can be a...
Source
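To illustrate the shooting idea in the entry above, a minimal sketch on a simpler model problem (not the Falkner-Skan setup itself): the unknown initial slope is chosen so the far boundary condition is met. A bracketing root-finder stands in for the modified Newton's method the paper uses.

```python
# Shooting method on a model BVP: y'' = -y, y(0) = 0, y(1) = 1.
# Choose the unknown initial slope s = y'(0) so that y(1) matches.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def boundary_mismatch(s):
    """Integrate the IVP with slope s; return the residual y(1) - 1."""
    sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, 1.0), [0.0, s], rtol=1e-8)
    return sol.y[0, -1] - 1.0

# Root-find on the residual; Newton-type variants play the same role
# when derivatives of the residual are available.
s_star = brentq(boundary_mismatch, 0.0, 5.0)
print("shooting slope y'(0) =", s_star)   # exact answer: 1/sin(1) ~ 1.188
```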
#1Elizabeth D. Dolan (Argonne National Laboratory)H-Index: 3
#2Jorge J. Moré (Argonne National Laboratory)H-Index: 51
We propose performance profiles (distribution functions for a performance metric) as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
Source
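For reference, a minimal sketch of the performance-profile computation this entry defines: each solver's profile value at τ is the fraction of problems on which its cost is within a factor τ of the best solver's cost on that problem. The timing data below is hypothetical.

```python
# Dolan-More performance profiles: r[p, s] = t[p, s] / min_s t[p, s];
# rho_s(tau) is the fraction of problems with ratio at most tau.
import numpy as np

def performance_profile(times, taus):
    """times: (n_problems, n_solvers) array of costs (np.inf for failures).
    Returns rho of shape (len(taus), n_solvers)."""
    best = times.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = times / best                     # performance ratios r[p, s]
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# Illustrative data: 3 problems, 2 solvers (hypothetical timings)
times = np.array([[1.0, 2.0], [3.0, 1.5], [np.inf, 4.0]])
print(performance_profile(times, taus=[1.0, 2.0, 4.0]))
```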
Conjugate gradient methods are widely used for unconstrained optimization, especially large-scale problems. The strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally, provided the line search satisfies the standard Wolfe conditions. The conditions on the objective function are also weak, being similar to those required by the Zoutendijk condition.
Source
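The Zoutendijk condition mentioned above is the standard summability bound used in these global convergence arguments; stated here from the general literature (not quoted from this entry):

```latex
% Zoutendijk condition: for descent directions d_k and Wolfe line searches,
% the weighted slopes are summable, which combined with a bound on the
% directions forces liminf ||g_k|| = 0.
\[
  \sum_{k=0}^{\infty} \frac{\left(g_k^{\top} d_k\right)^{2}}{\lVert d_k \rVert^{2}} \;<\; \infty .
\]
```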
Cited By (7)
#1Siti Farhana HusinH-Index: 1
Last. Mohd RivaieH-Index: 8
view all 4 authors...
This study employs exact line search iterative algorithms for solving large-scale unconstrained optimization problems, in which the search direction is a three-term modification of an iterative method with two different scaled parameters. The objective of this research is to assess the effectiveness of the new directions both theoretically and numerically. The sufficient descent property and global convergence analysis of the suggested methods are established. For numerical experiment purposes, the methods a...
Source
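A minimal sketch of the general three-term direction shape such methods build on, assuming the common form dk = -gk + βk dk-1 + θk yk-1 with yk-1 = gk - gk-1; the specific scaled parameters of the cited method are not reproduced here, so βk and θk are left as user-supplied placeholders.

```python
# General three-term CG direction; beta and theta are placeholder scalings,
# not the cited method's specific parameters.
import numpy as np

def three_term_direction(g_new, g_old, d_old, beta, theta):
    y = g_new - g_old
    return -g_new + beta * d_old + theta * y
```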
#1Maulana MalikH-Index: 4
#2Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 16
Last. Abdul Talib Bon (UTHM: Universiti Tun Hussein Onn Malaysia)H-Index: 9
view all 6 authors...
Source
#1Norhaslinda Zull (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 1
#3Syazni Shoid (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 2
Last. Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 16
view all 7 authors...
The conjugate gradient (CG) method is one of the optimization methods that are often used in practical applications. The continuous and numerous studies conducted on the CG method have led to vast improvements in its convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to the existing CG methods is evaluated by testing them all on a set of test functio...
Source
#3N Shapiee (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 1
#6Mohd Rivaie (UiTM: Universiti Teknologi MARA)H-Index: 8
Last. Mustafa Mamat (UniSZA: Universiti Sultan Zainal Abidin)H-Index: 16
view all 7 authors...
The conjugate gradient (CG) method is an evolution of computational methods for solving unconstrained optimization problems. The approach is easy to implement due to its simplicity and has been proven effective in real-life applications. Although this field has received a copious amount of attention in recent years, some of the new CG algorithms cannot surpass the efficiency of the earlier versions. Therefore, in this paper, a new CG coefficient which retains the sufficient ...
Source
#2Mustafa MamatH-Index: 16
The conjugate gradient (CG) method is an important technique in unconstrained optimization due to its effectiveness and low memory requirements. The focus of this paper is to introduce a new CG method for solving large-scale unconstrained optimization problems. Theoretical proofs show that the new method fulfills the sufficient descent condition if the strong Wolfe-Powell inexact line search is used. Moreover, computational results show that our proposed method outperforms other existing CG methods.
Source
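For reference, the sufficient descent condition invoked in this entry and several others is, in its standard form from the general literature (not quoted from this page):

```latex
% Sufficient descent condition: there exists c > 0 such that, for all k,
% the direction d_k is a descent direction uniformly in k.
\[
  g_k^{\top} d_k \;\le\; -c\,\lVert g_k \rVert^{2}, \qquad c > 0 .
\]
```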
#1Wan KhadijahH-Index: 1
#2Mohd RivaieH-Index: 8
Recently, numerous studies have been concerned with conjugate gradient methods for solving large-scale unconstrained optimization problems. In this paper, a three-term conjugate gradient method which always satisfies the sufficient descent condition, named Three-Term Rivaie-Mustafa-Ismail-Leong (TTRMIL), is proposed for unconstrained optimization. Under standard conditions, the TTRMIL method is proved to be globally convergent under the strong Wolfe line search. Finally, numerical results are provided for...
Source