A new sufficient descent conjugate gradient method with exact line search

Published on Dec 1, 2019 · DOI: 10.1063/1.5136479
Nur Idalisa, Mohd Rivaie (Estimated H-index: 8), Mohd Agos Salim Nasir (Estimated H-index: 5)
References (12)
#1 Muhammed Jassem Al-Muhammed (American University of Madaba) H-Index: 4
#2 Raed Abu Zitar (College of Information Technology) H-Index: 9
Abstract Devising ways to handle problem optimization is an important yet challenging task. The aim is always methods that can quickly and effectively discover the global optimum of rather complicated mathematical functions that model real-world settings. Typically, the global optima of these functions are difficult to discover because the functions may (1) lack continuity and differentiability, (2) have multiple local optima, and (3) have complex expressions. In this paper, we address ...
#1 J. K. Liu (CTGU: Chongqing Three Gorges University) H-Index: 4
#2 Yuming Feng (CTGU: Chongqing Three Gorges University) H-Index: 11
#3 Limin Zou (CTGU: Chongqing Three Gorges University) H-Index: 10
Three-term conjugate gradient methods for solving large-scale optimization problems are favored by many researchers because of their nice descent and convergence properties. In this paper, we extend some new conjugate gradient methods and construct some three-term conjugate gradient methods. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition without any line search. Under the standard Wolfe line search, the global converg...
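The abstract stops short of concrete formulas, but the property it highlights is easy to demonstrate. The sketch below uses the well-known modified PRP three-term construction in the style of Zhang, Zhou and Li (cited later in this list), not necessarily the exact update of Liu, Feng and Zou; the function name is illustrative. The point is the one the abstract makes: the two extra terms cancel in g·d, so g·d = -||g||^2 holds identically, with no line search involved.

```python
import numpy as np

def three_term_direction_mprp(g, g_prev, d_prev):
    # Modified PRP three-term direction (Zhang-Zhou-Li style sketch):
    #   d = -g + beta * d_prev - theta * y,  with  y = g - g_prev,
    #   beta  = (g @ y)      / ||g_prev||^2
    #   theta = (g @ d_prev) / ||g_prev||^2
    # The beta and theta contributions to g @ d cancel, so g @ d = -||g||^2.
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y

# Sufficient descent holds for arbitrary vectors, i.e. independently of
# any line search an algorithm might perform:
rng = np.random.default_rng(0)
for _ in range(5):
    g, g_prev, d_prev = rng.standard_normal((3, 4))
    d = three_term_direction_mprp(g, g_prev, d_prev)
    assert np.isclose(g @ d, -(g @ g))
```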
#1 Vahid Beiranvand (UBC: University of British Columbia) H-Index: 4
#2 Warren Hare (UBC: University of British Columbia) H-Index: 23
#3 Yves Lucet (UBC: University of British Columbia) H-Index: 19
Comparing, or benchmarking, optimization algorithms is a complicated task that involves many subtle considerations to yield a fair and unbiased evaluation. In this paper, we systematically review the benchmarking process for optimization algorithms and discuss the challenges of fair comparison. We provide suggestions for each step of the comparison process and highlight the pitfalls to avoid when evaluating the performance of optimization algorithms. We also discuss various methods of reporti...
Test functions are important to validate and compare the performance of optimization algorithms. There have been many test or benchmark functions reported in the literature; however, there is no standard list or set of benchmark functions. Ideally, test functions should have diverse properties so that they can be truly useful to test new algorithms in an unbiased way. For this purpose, we have reviewed and compiled a rich set of 175 benchmark functions for unconstrained optimization problems with div...
#1 Momin Jamil (BTH: Blekinge Institute of Technology) H-Index: 6
#2 Xin-She Yang (Middlesex University) H-Index: 88
Test functions are important to validate and compare the performance of optimisation algorithms. There have been many test or benchmark functions reported in the literature; however, there is no standard list or set of benchmark functions. Ideally, test functions should have diverse properties to be truly useful to test new algorithms in an unbiased way. For this purpose, we have reviewed and compiled a rich set of 175 benchmark functions for unconstrained optimisation problems with diverse prop...
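To make the idea of a benchmark collection concrete, here are two standard unconstrained test functions of the kind such surveys catalogue. The definitions are the widely known ones; the particular pair chosen here is an illustration, not an excerpt from the 175-function list.

```python
import numpy as np

def rosenbrock(x):
    # Unimodal with a narrow curved valley; global minimum f(1,...,1) = 0.
    x = np.asarray(x)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def rastrigin(x):
    # Highly multimodal; global minimum f(0,...,0) = 0.
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

assert rosenbrock(np.ones(5)) == 0.0
assert rastrigin(np.zeros(5)) == 0.0
```

The two exercise different algorithm weaknesses: Rosenbrock punishes methods that cannot follow a curved valley, while Rastrigin punishes methods that get trapped in local minima.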
#1 M. Asghar Bhatti H-Index: 9
1 Optimization Problem Formulation: 1.1 Optimization Problem Formulation; 1.2 The Standard Form of an Optimization Problem; 1.3 Solution of Optimization Problems; 1.4 Time Value of Money; 1.5 Concluding Remarks; 1.6 Problems. 2 Graphical Optimization: 2.1 Procedure for Graphical Optimization; 2.2 GraphicalSolution function; 2.3 Graphical Optimization Examples; 2.4 Problems. 3 Mathematical Preliminaries: 3.1 Vectors and Matrices; 3.2 Approximation Using the Taylor Series; 3.3 Solut...
#1 Li Zhang (CSUST: Changsha University of Science and Technology) H-Index: 3
#2 Weijun Zhou (Hunan University) H-Index: 5
#3 Dong-Hui Li (Hunan University) H-Index: 17
In this paper, we propose a three-term conjugate gradient method whose search direction satisfies the sufficient descent condition g_k^T d_k = -||g_k||^2. This property is independent of any line search used. When an exact line search is used, this method reduces to the standard Hestenes-Stiefel conjugate gradient method. We also introduce two variants of the proposed method which still preserve the sufficient descent property, and prove that these two methods converge globally with standard Wolfe line search ...
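A short sketch of this construction (the naming is illustrative; the formulas are the standard three-term Hestenes-Stiefel update associated with this paper): the direction is d = -g + beta_hs * d_prev - theta * y_prev, and the theta term exactly cancels the beta term's contribution to g·d, which is why sufficient descent needs no line search.

```python
import numpy as np

def tths_direction(g, d_prev, y_prev):
    # Three-term Hestenes-Stiefel direction:
    #   beta_hs = (g @ y_prev) / (d_prev @ y_prev)
    #   theta   = (g @ d_prev) / (d_prev @ y_prev)
    # so the last two terms cancel in g @ d, giving g @ d = -||g||^2.
    denom = d_prev @ y_prev
    beta_hs = (g @ y_prev) / denom
    theta = (g @ d_prev) / denom
    return -g + beta_hs * d_prev - theta * y_prev

rng = np.random.default_rng(1)
g, d_prev, y_prev = rng.standard_normal((3, 6))
d = tths_direction(g, d_prev, y_prev)
assert np.isclose(g @ d, -(g @ g))  # sufficient descent, no line search needed
```

Under an exact line search g @ d_prev = 0, so theta vanishes and the direction collapses to the usual Hestenes-Stiefel update, which is the reduction the abstract claims.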
#1 Elizabeth D. Dolan (Argonne National Laboratory) H-Index: 3
#2 Jorge J. Moré (Argonne National Laboratory) H-Index: 51
We propose performance profiles (distribution functions for a performance metric) as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
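The construction is compact enough to sketch. Following the paper's definition, the performance ratio of solver s on problem p is its cost divided by the best cost any solver achieved on p, and the profile rho_s(tau) is the fraction of problems on which that ratio is at most tau. The function and the toy data below are an illustration, not results from any cited paper.

```python
import numpy as np

def performance_profile(T, taus):
    # T[p, s]: cost (e.g. iterations or time) of solver s on problem p;
    # np.inf marks a failure. Returns rho[s, i], the fraction of problems
    # on which solver s is within a factor taus[i] of the best solver.
    ratios = T / T.min(axis=1, keepdims=True)         # r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau)    # rho_s(tau)
                      for tau in taus]
                     for s in range(T.shape[1])])

# Hypothetical results: 4 problems, 2 solvers (inf = solver failed).
T = np.array([[1.0, 2.0],
              [3.0, 3.0],
              [2.0, np.inf],
              [5.0, 4.0]])
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])
# rho[s, 0] is the fraction of problems where solver s was (tied) fastest;
# as tau grows, rho[s, i] approaches the solver's overall success rate.
```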
The conjugate gradient method is particularly useful for minimizing functions of very many variables because it does not require the storage of any matrices. However the rate of convergence of the algorithm is only linear unless the iterative procedure is "restarted" occasionally. At present it is usual to restart every n or (n + 1) iterations, where n is the number of variables, but it is known that the frequency of restarts should depend on the objective function. Therefore the main purpose of t...
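This is Powell's restart paper, and the test most often quoted from it restarts when successive gradients stray too far from orthogonality, rather than on a fixed iteration count. A minimal sketch, assuming the commonly cited threshold of 0.2 (the helper name is illustrative):

```python
import numpy as np

def should_restart(g, g_prev, nu=0.2):
    # Powell-style restart test: for exact CG on a quadratic, successive
    # gradients are orthogonal, so a large |g . g_prev| signals lost
    # conjugacy and the next direction should reset to steepest descent.
    return abs(g @ g_prev) >= nu * (g @ g)

# Inside a CG loop (sketch):
#   if should_restart(g, g_prev):
#       d = -g               # restart with steepest descent
#   else:
#       d = -g + beta * d    # usual conjugate gradient update
```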
#1 E. Polak H-Index: 1
#2 G. Ribière H-Index: 1
Cited By (0)