Ahmad Alhawarat
Isra University
Topics: Image restoration, Regular polygon, Algorithm, Gradient method, CPU time, Optimization problem, Gradient descent, Set (abstract data type), Nonlinear conjugate gradient method, Scale (ratio), Descent (mathematics), Applied mathematics, Standard test, Unconstrained optimization, Second derivative, Mathematics, Computer science, Computation, Central processing unit, Convergence (routing), Conjugate gradient method, Line search, Property (programming), Robustness (computer science)
12 Publications · 5 H-index · 38 Citations

Publications (15), newest first
#1 Ahmad Alhawarat (UMT: Universiti Malaysia Terengganu), H-Index: 5
Last: Shahrina Ismail (USIM: Universiti Sains Islam Malaysia), H-Index: 1
(5 authors in total)
The conjugate gradient (CG) method is a useful tool for finding the optimum point of an unconstrained optimization problem, since it requires neither the second derivative nor approximations of it. Moreover, the CG method can be applied in many fields, such as machine learning, deep learning, and neural networks, among others. This paper constructs a four-term conjugate gradient method that satisfies the descent property and convergence properties in order to reach a stationary point. The new ...
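For context, the descent property mentioned here is usually stated against the standard CG iteration (textbook background, not the paper's specific four-term direction):

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]

where g_k = \nabla f(x_k); the sufficient descent property requires g_k^{\top} d_k \le -c\,\|g_k\|^2 for some constant c > 0.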
#1 Ahmad Alhawarat (UMT: Universiti Malaysia Terengganu), H-Index: 5
#2 Ghaliah Alhamzi (Islamic University)
Last: Zabidin Salleh (UMT: Universiti Malaysia Terengganu), H-Index: 11
#1 Zabidin Salleh, H-Index: 11
Last: Ahmad Alhawarat, H-Index: 5
(4 authors in total)
The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems, since, unlike Newton's method or its approximations, it does not require the second derivative. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many complicated two- and three-term methods have been proposed to solve these optimization problems. In this paper, we propose a simple, easy, efficient, and robust conj...
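As a rough illustration of how a CG iteration of this kind is typically implemented, here is a generic sketch with a PRP-style parameter and a simple backtracking line search; it is not the method proposed in this paper, and the function names and line-search constants are assumptions:

import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic nonlinear conjugate gradient loop (illustrative only).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:
            d = -g  # fall back to steepest descent if d is not a descent direction
        # Backtracking line search on the Armijo (sufficient decrease) condition.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak beta, truncated at zero as a restart safeguard.
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_minimize(f, grad, [-1.2, 1.0]))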
The conjugate gradient method is a useful tool for solving both large- and small-scale unconstrained optimization problems. In addition, it can be applied in many fields, such as engineering, medical research, and computer science. In this paper, a convex combination of two different search directions is proposed. The new combination satisfies the sufficient descent condition, and its convergence analysis is established. Moreover, a new conjugate gradient formula is proposed. The new formula satis...
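A convex combination of two search directions, as described here, generally takes the following shape (a schematic template; the particular directions d_k^{(1)}, d_k^{(2)} and the parameter \theta_k used in the paper may differ):

\[
d_k = \theta_k\, d_k^{(1)} + (1 - \theta_k)\, d_k^{(2)}, \qquad \theta_k \in [0, 1],
\]

with \theta_k chosen so that the sufficient descent condition g_k^{\top} d_k \le -c\,\|g_k\|^2 holds for the combined direction.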
#1 Ahmad Alhawarat, H-Index: 5
#2 Thoi Trung Nguyen, H-Index: 1
Last: Zabidin Salleh, H-Index: 11
(4 authors in total)
To find a solution of an unconstrained optimization problem, we normally use a conjugate gradient (CG) method, since it does not require the memory or storage for second derivatives that Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method does. Recently, a modification of the Polak–Ribière method with a new restart condition was proposed, called the AZPRP method. In this paper, we propose a new modification of the AZPRP CG method to solve large-scale unconstrained optimization problems based on...
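For reference, the Polak–Ribière(–Polyak) parameter that the AZPRP family modifies is

\[
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^2},
\]

and a common restart safeguard truncates it at zero, \beta_k^{+} = \max\{\beta_k^{\mathrm{PRP}}, 0\}; the specific restart condition used by AZPRP may differ from this generic safeguard.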
#2 Mustafa Mamat, H-Index: 16
Last: Mohd Rivaie, H-Index: 8
(4 authors in total)
The most well-known method in unconstrained optimization is the conjugate gradient method, which is used to obtain the optimal solution of unconstrained optimization problems. It is applied in many fields, especially computer science and engineering, owing to its convergence speed, simplicity, and low memory requirements. A new modified conjugate gradient method is presented in this paper. Under the strong Wolfe-Powell (SWP) line search, this method is proved to possess...
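The strong Wolfe-Powell (SWP) line search referred to here selects the step size \alpha_k to satisfy

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
\bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le \sigma\, \bigl| g_k^{\top} d_k \bigr|,
\]

with constants 0 < \delta < \sigma < 1.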
#1 Bakhtawar Baluch (UMT: Universiti Malaysia Terengganu), H-Index: 2
#2 Zabidin Salleh (UMT: Universiti Malaysia Terengganu), H-Index: 11
Last: Ahmad Alhawarat (Isra University), H-Index: 5
(3 authors in total)
This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence with an exact line search, this is not guaranteed under an inexact line search. In addition, the HS method does not always satisfy the descent property. Our modified three-term conjugate gradient method possesses the sufficient descent property regardless of the type of line search and guarantees glo...
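For reference, the Hestenes–Stiefel parameter is defined below, and three-term variants append a correction term to the search direction (the coefficient \theta_k here is schematic; the paper's exact choice may differ):

\[
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad y_{k-1} = g_k - g_{k-1},
\]

\[
d_k = -g_k + \beta_k^{\mathrm{HS}} d_{k-1} + \theta_k\, y_{k-1}.
\]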
Conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several recent studies have been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new CG parameter is proposed. The new parameter possesses global convergence properties under the strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust com...
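As a concrete reading of the SWP conditions that appear throughout these convergence results, a small checker might look like the following sketch; the parameter defaults are typical textbook values, not values from the paper:

import numpy as np

def satisfies_swp(f, grad, x, d, alpha, delta=1e-4, sigma=0.1):
    # Strong Wolfe-Powell: sufficient decrease plus a two-sided curvature bound.
    g0_d = grad(x) @ d
    sufficient_decrease = f(x + alpha * d) <= f(x) + delta * alpha * g0_d
    curvature = abs(grad(x + alpha * d) @ d) <= sigma * abs(g0_d)
    return sufficient_decrease and curvature

Requiring 0 < delta < sigma < 1 guarantees that, for a descent direction on a function bounded below, the set of acceptable step sizes is nonempty.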
#1 Ahmad Alhawarat (UMT: Universiti Malaysia Terengganu), H-Index: 5
#2 Zabidin Salleh (UMT: Universiti Malaysia Terengganu), H-Index: 11
Last: Mohd Rivaie (UiTM: Universiti Teknologi MARA), H-Index: 8
(4 authors in total)
The conjugate gradient (CG) method is one of the most popular methods for solving large-scale unconstrained optimization problems. In this paper, a new modified version of the CG formula introduced by Polak, Ribière, and Polyak is proposed for problems that are bounded below and have a Lipschitz-continuous gradient. The new parameter provides global convergence properties when the strong Wolfe-Powell (SWP) line search or the weak Wolfe-Powell (WWP) line search is employed. A proof of a ...
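The weak Wolfe-Powell (WWP) line search mentioned alongside SWP keeps the same sufficient decrease condition but relaxes the curvature requirement to a one-sided bound:

\[
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k,
\]

so every step accepted by SWP is also accepted by WWP, but not conversely.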