A Class New Hybrid Conjugate Gradient Method for Unconstrained Optimization

Published on Mar 20, 2015 in The Journal of Information and Computational Science · DOI: 10.12733/JICS20105721
Yunlong Lu, Wenyu Li, Yueting Yang
Abstract
In this paper, a class of new-parameter conjugate gradient methods and a new hybrid conjugate gradient method are proposed. The global convergence of the algorithms is proved under the Wolfe line search without the descent condition. Numerical experiments show that the hybrid conjugate gradient algorithm performs well in practice.
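The page does not reproduce the paper's parameter formula, so the following is only a minimal sketch of a generic hybrid conjugate gradient iteration under the Wolfe line search; the beta rule shown (Hestenes-Stiefel clipped into [0, Dai-Yuan]) and the name hybrid_cg are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic hybrid CG loop. The beta rule below is an assumed,
    # illustrative hybrid (HS clipped into [0, DY]); the paper's own
    # formula is not reproduced on this page.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:            # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y                # positive under the Wolfe conditions
        beta_hs = (g_new @ y) / denom
        beta_dy = (g_new @ g_new) / denom
        beta = max(0.0, min(beta_hs, beta_dy))   # assumed hybrid parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

With SciPy's built-in rosen and rosen_der, hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0])) should recover the minimizer near (1, 1).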
References (15)
#1Saman Babaie-Kafaki (Semnan University)H-Index: 15
#2Nezam Mahdavi-Amiri (Sharif University of Technology)H-Index: 24
Abstract Taking advantage of the attractive features of Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods following Andrei's approach of hybridizing the conjugate gradient parameters convexly and Powell's approach of nonnegative restriction of the conjugate gradient parameters. In our methods, the hybridization parameter is obtained based on a recently proposed hybrid secant equation. Numerical results demonstrating the e...
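For concreteness, the convex hybridization described here takes the form (with y_k = g_{k+1} - g_k and search directions d_k; the rule fixing the hybridization parameter via the hybrid secant equation is not reproduced on this page):

\[
\beta_k = (1-\theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY},
\qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
\beta_k^{DY} = \frac{\lVert g_{k+1}\rVert^{2}}{d_k^{\top} y_k},
\qquad
\theta_k \in [0,1].
\]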
We propose and generalize a new nonlinear conjugate gradient method for unconstrained optimization. Global convergence is proved under the Wolfe line search. Numerical experiments are reported that support the theoretical analysis and show the presented methods outperforming the CGDESCENT method.
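For reference, the Wolfe line search invoked throughout these papers accepts a step size \alpha_k along d_k when

\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1\,\alpha_k\, g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k,
\]

for constants 0 < c_1 < c_2 < 1.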
#1Wenyu SunH-Index: 6
#2Ya-xiang YuanH-Index: 35
Preface 1 Introduction 1.1 Introduction 1.2 Mathematics Foundations 1.2.1 Norm 1.2.2 Inverse and Generalized Inverse of a Matrix 1.2.3 Properties of Eigenvalues 1.2.4 Rank-One Update 1.2.5 Function and Differential 1.3 Convex Sets and Convex Functions 1.3.1 Convex Sets 1.3.2 Convex Functions 1.3.3 Separation and Support of Convex Sets 1.4 Optimality Conditions for Unconstrained Case 1.5 Structure of Optimization Methods Exercises 2 Line Search 2.1 Introduction 2.2 Convergence Theory for Exact Li...
#1Neculai AndreiH-Index: 21
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint (1995); others are from Moré, Garbow and...
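As a flavor of what such a collection contains, below is one classical entry, the extended Rosenbrock function with its standard initial point (-1.2, 1, -1.2, 1, ...); the implementation is an illustrative sketch, not code from Andrei's report.

```python
import numpy as np

def extended_rosenbrock(x):
    # f(x) = sum over pairs of 100*(x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2
    odd, even = x[0::2], x[1::2]
    return np.sum(100.0 * (even - odd**2)**2 + (1.0 - odd)**2)

def extended_rosenbrock_x0(n):
    # Standard initial point (-1.2, 1, -1.2, 1, ...); n should be even.
    x0 = np.ones(n)
    x0[0::2] = -1.2
    return x0
```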
#1Gaohang Yu (SYSU: Sun Yat-sen University)H-Index: 17
#2Yanlin Zhao (Xida: Guangxi University)H-Index: 1
Last. Zengxin Wei (Xida: Guangxi University)H-Index: 21
view all 3 authors...
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization which possesses the following three properties: (i) the sufficient descent property holds without any line search; (ii) employing a steplength technique that ensures the Zoutendijk condition holds, the method is globally convergent; (iii) the method inherits an important property of the Polak-Ribiere-Polyak (PRP) method: the tendency to turn towards the steepest descent d...
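The properties named in this abstract have standard formal statements: sufficient descent means g_k^T d_k <= -c ||g_k||^2 for some c > 0 and all k; the Zoutendijk condition is

\[
\sum_{k \ge 0} \frac{(g_k^{\top} d_k)^{2}}{\lVert d_k\rVert^{2}} < \infty;
\]

and the Polak-Ribiere-Polyak parameter is

\[
\beta_k^{PRP} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\lVert g_k\rVert^{2}}.
\]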
In this paper, we propose a class of new descent methods and give two hybrid methods based on the Hestenes-Stiefel method and our new methods. We prove their global convergence under the Wolfe line search without the descent condition. Numerical experiments show that our methods are very efficient, especially for large-scale problems.
#1Nicholas I. M. Gould (RAL: Rutherford Appleton Laboratory)H-Index: 56
#2Dominique Orban (École Polytechnique de Montréal)H-Index: 19
Last. Philippe L. Toint (Université de Namur)H-Index: 1
view all 4 authors...
In this paper, we examine the sensitivity of trust-region algorithms to the parameters related to step acceptance and the update of the trust region. We show, in the context of unconstrained programming, that the numerical efficiency of these algorithms can easily be improved by choosing appropriate parameters. Recommended ranges of values for these parameters are exhibited on the basis of extensive numerical tests.
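The parameters in question appear in the standard trust-region acceptance/update step, sketched below; the names eta1, eta2, gamma1, gamma2 and their default values are the conventional ones, not necessarily the notation or the tuned ranges recommended by this paper.

```python
def trust_region_update(rho, radius, step_norm,
                        eta1=0.1, eta2=0.75, gamma1=0.5, gamma2=2.0):
    # rho = (actual reduction) / (reduction predicted by the model).
    # Common textbook defaults, not the paper's recommended values.
    accept = rho >= eta1
    if rho < eta1:                     # poor model agreement: shrink
        radius = gamma1 * step_norm
    elif rho >= eta2:                  # very good agreement: allow growth
        radius = max(radius, gamma2 * step_norm)
    return accept, radius
```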
#1Nicholas I. M. Gould (RAL: Rutherford Appleton Laboratory)H-Index: 56
#2Dominique Orban (NU: Northwestern University)H-Index: 19
Last. Philippe L. Toint (Université de Namur)H-Index: 49
view all 3 authors...
The initial release of CUTE, a widely used testing environment for optimization software, was described by Bongartz et al. [1995]. A new version, now known as CUTEr, is presented. Features include reorganisation of the environment to allow simultaneous multi-platform installation, new tools for, and interfaces to, optimization packages, and a considerably simplified and entirely automated installation procedure for Unix systems. The environment is fully backward compatible with its predecessor,...
#1Elizabeth D. Dolan (Argonne National Laboratory)H-Index: 3
#2Jorge J. Moré (Argonne National Laboratory)H-Index: 51
We propose performance profiles (distribution functions for a performance metric) as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
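Computing a performance profile is simple: divide each solver's cost on a problem by the best cost any solver achieved on that problem, then plot, for each solver, the fraction of problems whose ratio falls within a factor tau. A minimal sketch, assuming a problems-by-solvers cost matrix with np.inf marking failures and at least one success per problem:

```python
import numpy as np

def performance_profile(costs, taus):
    # costs: (n_problems, n_solvers) array of a cost metric
    # (CPU time, iterations, ...); np.inf marks a failure.
    # Returns rho with rho[s, i] = fraction of problems that
    # solver s solves within a factor taus[i] of the best solver.
    best = costs.min(axis=1, keepdims=True)           # best cost per problem
    ratios = costs / best                             # performance ratios r_{p,s}
    taus = np.asarray(taus)
    hit = ratios[None, :, :] <= taus[:, None, None]   # (len(taus), P, S)
    return hit.mean(axis=1).T                         # (S, len(taus))
```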
#1Yu-Hong Dai (CAS: Chinese Academy of Sciences)H-Index: 37
#2Ya-xiang Yuan (CAS: Chinese Academy of Sciences)H-Index: 35
In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family of methods not only includes the already existing six practical nonlinear conjugate gradient methods, but subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell's restart criterion, the three-parameter family of methods with the strong Wolfe line search is shown to ensure the descent property of each search direc...
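For orientation, four of the classical parameters that such a family subsumes are (with y_k = g_{k+1} - g_k):

\[
\beta_k^{FR} = \frac{\lVert g_{k+1}\rVert^{2}}{\lVert g_k\rVert^{2}},
\qquad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\lVert g_k\rVert^{2}},
\qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
\beta_k^{DY} = \frac{\lVert g_{k+1}\rVert^{2}}{d_k^{\top} y_k}.
\]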
Cited By (2)
#1Wenyu Li (Beihua University)H-Index: 1
#2Yueting Yang (Beihua University)H-Index: 2
A nonmonotone hybrid conjugate gradient method is proposed, in which the technique of the nonmonotone Wolfe line search is used. Under mild assumptions, we prove the global convergence and linear convergence rate of the method. Numerical experiments are reported.
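A nonmonotone Wolfe line search typically replaces the usual sufficient-decrease test with one against a maximum of recent function values, in the style of Grippo, Lampariello and Lucidi (the cited paper's exact conditions may differ):

\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + \delta\,\alpha_k\, g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k,
\]

with 0 < \delta < \sigma < 1 and memory length m(k).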