A Spectral Conjugate Gradient Method with Descent Property

Published on Feb 19, 2020
DOI: 10.3390/MATH8020280
Jinbao Jian (Estimated H-index: 1)
Lin Yang (Estimated H-index: 1)
Meixing Liu (Estimated H-index: 1)
+ 2 authors
The spectral conjugate gradient method (SCGM) is an important generalization of the conjugate gradient method (CGM) and one of the effective numerical methods for large-scale unconstrained optimization. Designing the spectral parameter and the conjugate parameter is central to any SCGM, and the aim of this paper is to propose a new and effective choice of these two parameters. First, motivated by the requirements of the strong Wolfe line search, we design a new spectral parameter. Second, we propose a hybrid conjugate parameter. This way of generating the two parameters ensures that the search directions always possess the descent property, without depending on any line search rule. As a result, a new SCGM with the standard Wolfe line search is proposed. Under the usual assumptions, the global convergence of the proposed SCGM is proved. Finally, a large number of numerical experiments on 108 test instances, from 2 to 1,000,000 dimensions, drawn from the CUTE library and other classic test collections, are executed, comparing the presented SCGM with both SCGMs and CGMs. The detailed results and their corresponding performance profiles are reported, showing that the proposed SCGM is effective and promising.
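The abstract does not give the paper's exact formulas for the spectral parameter θ_k and the conjugate parameter β_k, but the general spectral CG iteration it builds on can be sketched as follows. The choices below (θ_k = 1, a Fletcher-Reeves β_k, and a simple Armijo backtracking line search) are illustrative stand-ins, not the paper's method, which uses the standard Wolfe line search.

```python
import numpy as np

def spectral_cg(f, grad, x0, max_iter=200, tol=1e-6):
    """Generic spectral conjugate gradient sketch.

    Illustrative parameter choices (NOT the paper's formulas):
    theta_k = 1, beta_k = Fletcher-Reeves, Armijo backtracking.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking line search on the Armijo condition
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves conjugate parameter
        theta = 1.0                       # placeholder spectral parameter
        d = -theta * g_new + beta * d     # spectral CG direction update
        x, g = x_new, g_new
    return x

# minimize f(x) = ||x||^2; the minimizer is the origin
x_star = spectral_cg(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -4.0]))
```

The direction update `d = -theta * g + beta * d_prev` is the defining feature of an SCGM: the extra spectral factor θ scales the gradient term, and the paper's contribution is a particular (θ_k, β_k) pair that makes d a descent direction for any line search.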
📖 Papers frequently viewed together
#1 Xiangli Li (GUET: Guilin University of Electronic Technology) H-Index: 6
#2 Juanjuan Shi (GUET: Guilin University of Electronic Technology) H-Index: 1
Last. Jianglan Yu (GUET: Guilin University of Electronic Technology) H-Index: 1
Abstract The spectral conjugate gradient method is an effective method for large-scale unconstrained optimization problems. In this paper, based on the quasi-Newton direction and the quasi-Newton equation, a new spectral conjugate gradient method is proposed. This method is motivated by the three-term modified Polak-Ribière-Polyak (PRP) method and spectral parameters. The global convergence of the algorithm is proved for general functions under a strong Wolfe line search. Numerical results show that the new...
#1 Xianzhen Jiang (Yulin Normal University) H-Index: 1
#2 Jinbao Jian (GXUN: Guangxi University for Nationalities) H-Index: 18
Abstract The conjugate gradient methods (CGMs) are very effective iterative methods for solving large-scale unconstrained optimization. The aim of this work is to improve the Fletcher–Reeves and Dai–Yuan CGMs. First, based on the conjugate parameters of the Fletcher–Reeves (FR) method and the Dai–Yuan (DY) method, and combining the second inequality of the strong Wolfe line search, two new conjugate parameters are constructed. Second, using the two new conjugate parameters, another FR type conju...
#1 J. K. Liu (CTGU: Chongqing Three Gorges University) H-Index: 4
#2 Yuming Feng (CTGU: Chongqing Three Gorges University) H-Index: 11
Last. Limin Zou (CTGU: Chongqing Three Gorges University) H-Index: 10
Abstract This paper establishes a spectral conjugate gradient method for solving unconstrained optimization problems, where the conjugate parameter and the spectral parameter satisfy a restrictive relationship. The search direction satisfies the sufficient descent condition at every iteration, without restarts; moreover, this feature is independent of any line search. Under the standard Wolfe line searches, the global convergence of the proposed method is proved when |β_k| ≤ β_k^FR holds. The preliminary numerical...
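The restriction |β_k| ≤ β_k^FR that appears in this convergence result can be enforced by a simple truncation against the Fletcher-Reeves value. The gradient vectors below are hypothetical values chosen for illustration, not iterates from the paper.

```python
import numpy as np

# Hypothetical gradients at consecutive iterates (illustrative values only).
g_prev = np.array([1.0, -2.0])
g = np.array([0.5, 0.3])

beta_fr = (g @ g) / (g_prev @ g_prev)  # Fletcher-Reeves parameter, always >= 0
beta = -0.2                            # some candidate conjugate parameter
# Truncate so that |beta| <= beta_fr, as the restriction requires.
beta = float(np.clip(beta, -beta_fr, beta_fr))
```

Since β_k^FR = ||g_k||² / ||g_{k-1}||² is nonnegative, clipping the candidate into [-β_k^FR, β_k^FR] always produces a parameter satisfying the bound.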
#1 Min Sun H-Index: 1
#2 Jing Liu (Zhejiang University of Finance and Economics) H-Index: 4
In this paper, three modified Polak-Ribiere-Polyak (PRP) conjugate gradient methods for unconstrained optimization are proposed. They are based on the two-term PRP method proposed by Cheng (Numer. Funct. Anal. Optim. 28:1217-1230, 2007), the three-term PRP method proposed by Zhang et al. (IMA J. Numer. Anal. 26:629-640, 2006), and the descent PRP method proposed by Yu et al. (Optim. Methods Softw. 23:275-293, 2008). These modified methods possess the sufficient descent property without any line ...
#1 C. X. Kou (Beijing University of Posts and Telecommunications) H-Index: 1
#2 Yu-Hong Dai (CAS: Chinese Academy of Sciences) H-Index: 37
The introduction of quasi-Newton and nonlinear conjugate gradient methods revolutionized the field of nonlinear optimization. The self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method by Perry (Discussion Paper 269, 1977) and Shanno (SIAM J Numer Anal, 15, 1247-1257, 1978) provided a good understanding of the relationship between the two classes of methods. Based on the SSML-BFGS method, new conjugate gradient algorithms, called CG_DESCENT and CGOPT, have been p...
#1 Jinbao Jian (Yulin Normal University) H-Index: 18
#2 Lin Han (Xida: Guangxi University) H-Index: 1
Last. Xianzhen Jiang (Yulin Normal University) H-Index: 1
Abstract In this paper, based on some famous previous conjugate gradient methods, a new hybrid conjugate gradient method is presented for unconstrained optimization. The proposed method generates descent directions at every iteration; moreover, this property is independent of the steplength line search. Under the Wolfe line search, the proposed method possesses global convergence. Medium-scale numerical experiments and their performance profiles are reported, which show that the proposed meth...
New accelerated nonlinear conjugate gradient algorithms which are mainly modifications of Dai and Yuan's for unconstrained optimization are proposed. Using the exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For inexact line search the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in a very unpredictable manner, the al...
In this paper a new hybrid conjugate gradient algorithm is proposed and analyzed. The parameter β_k is computed as a convex combination of the Polak-Ribiere-Polyak and the Dai-Yuan conjugate gradient algorithms, i.e. β_k^N = (1 − θ_k)β_k^PRP + θ_k β_k^DY. The parameter θ_k in the convex combination is computed in such a way that the conjugacy condition is satisfied, independently of the line search. The line search uses the standard Wolfe conditions. The algorithm generates descent directions and when the iterat...
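The convex combination β_k^N = (1 − θ_k)β_k^PRP + θ_k β_k^DY can be computed directly from the iterates. In the paper θ_k is derived from the conjugacy condition; the fixed θ and the gradient/direction values below are illustrative placeholders, not the paper's formulas.

```python
import numpy as np

# Hypothetical iterates for illustration (names g_prev, g, d_prev are ours).
g_prev = np.array([1.0, -2.0])   # gradient at x_{k-1}
g = np.array([0.5, 0.3])         # gradient at x_k
d_prev = np.array([-1.0, 2.0])   # previous search direction

y = g - g_prev                             # gradient difference y_{k-1}
beta_prp = (g @ y) / (g_prev @ g_prev)     # Polak-Ribiere-Polyak parameter
beta_dy = (g @ g) / (d_prev @ y)           # Dai-Yuan parameter
theta = 0.4                                # convex weight in [0, 1] (placeholder)
beta_n = (1 - theta) * beta_prp + theta * beta_dy
```

Because θ lies in [0, 1], β_k^N always lies between the PRP and DY values, which is what lets the hybrid inherit properties from both parent methods.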
#1 Neculai Andrei H-Index: 21
A collection of unconstrained optimization test functions is presented. The purpose of this collection is to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies. For each function we give its algebraic expression and the standard initial point. Some of the test functions are from the CUTE collection established by Bongartz, Conn, Gould and Toint, (1995), others are from More, Garbow and...
#1 Li Zhang H-Index: 3
#2 Weijun Zhou H-Index: 5
Last. Dong-Hui Li H-Index: 17
In this paper, we propose a modified Polak-Ribiere-Polyak (PRP) conjugate gradient method. An attractive property of the proposed method is that the direction generated by the method is always a descent direction for the objective function. This property is independent of the line search used. Moreover, if an exact line search is used, the method reduces to the ordinary PRP method. Under appropriate conditions, we show that the modified PRP method with Armijo-type line search is globally convergen...
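A widely cited form of this modification is the three-term direction d_k = -g_k + β_k^PRP d_{k-1} - θ_k y_{k-1} with θ_k = g_k^T d_{k-1} / ||g_{k-1}||². Under that assumption (the abstract itself does not spell out the formula), the descent identity g_k^T d_k = -||g_k||² holds exactly for any iterates; the particular vectors below are illustrative.

```python
import numpy as np

# Illustrative iterates (the particular numbers are ours, not the paper's).
g_prev = np.array([1.0, -2.0])   # gradient at x_{k-1}
g = np.array([0.5, 0.3])         # gradient at x_k
d_prev = np.array([-1.0, 2.0])   # previous search direction

y = g - g_prev                              # gradient difference y_{k-1}
beta_prp = (g @ y) / (g_prev @ g_prev)      # Polak-Ribiere-Polyak parameter
theta = (g @ d_prev) / (g_prev @ g_prev)    # third-term weight
d = -g + beta_prp * d_prev - theta * y      # three-term modified PRP direction

# The two correction terms cancel in g.d, leaving g.d = -||g||^2 exactly,
# independently of whatever line search produced these iterates.
```

Expanding g^T d shows why: the β term contributes (g^T y)(g^T d_{k-1})/||g_{k-1}||² and the θ term subtracts the same quantity, so only -||g_k||² remains.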
Cited By 3
#1 Maulana Malik H-Index: 4
#2 Mustafa Mamat H-Index: 16
Last. Sukono H-Index: 7
The spectral conjugate gradient method is an efficient method for solving large-scale unconstrained optimization problems. In this paper, we propose a new spectral conjugate gradient method whose performance is analyzed numerically. We establish the descent condition and the global convergence property under some assumptions and the strong Wolfe line search. Numerical experiments to evaluate the method's efficiency are conducted using 98 problems with various dimensions and initial points. The nu...
#1 Siti Farhana Husin H-Index: 1
Last. Mohd Rivaie H-Index: 8
This study employs exact line search iterative algorithms for solving large-scale unconstrained optimization problems, in which the search direction is a three-term modification of an iterative method with two different scaling parameters. The objective of this research is to identify the effectiveness of the new directions both theoretically and numerically. The sufficient descent property and global convergence analysis of the suggested methods are established. For numerical experiment purposes, the methods a...