Ismail Mohd
Universiti Putra Malaysia
Finance · Gradient method · Mathematical optimization · Mathematical analysis · Business · Gradient descent · Natural disaster · Derivation of the conjugate gradient method · Nonlinear conjugate gradient method · Scale (ratio) · Food prices · Applied mathematics · Unconstrained optimization · Mathematics · Computer simulation · Function (mathematics) · Control theory · Convergence (routing) · Conjugate gradient method · Newton's method · Line search · Conjugate residual method
88 Publications
8 H-index
282 Citations
Publications (47)
#1 Yosza Dasril, H-Index: 4
#2 Ismail Mohd, H-Index: 8
#1 Ismail Mohd, H-Index: 8
#2 Yosza Dasril (UTeM: Universiti Teknikal Malaysia Melaka), H-Index: 4
(4 authors in total)
It is generally known that almost all filled function methods for one-dimensional unconstrained global optimization problems have computational weaknesses. This paper introduces a relatively new parameter-free filled function, which creates a non-ascending bridge from any isolated local minimizer to the first isolated local minimizer with a lower or equal function value. The algorithm's unprecedented function can be used to determine all extreme and inflection points between the two considered c...
#1 Herlina Napitupulu (UMT: Universiti Malaysia Terengganu)
#2 Ismail Mohd, H-Index: 8
(5 authors in total)
The global optimization problem remains of interest due to the challenge of locating the global optimum of a nonlinear objective function with multiple local minima. Two challenges in solving a global optimization problem are: first, how to reach a better minimizer from the current minimizer; and second, how to decide whether the obtained minimizer is the desired global minimizer. One of the recently considered deterministic and easily applied methods that addresses these problems is the fi...
#1 Herlina Napitupulu, H-Index: 3
#2 Ismail Mohd, H-Index: 8
The filled function method is one of the deterministic methods for solving global minimization problems. A filled function algorithm generally consists of two main phases: the first phase obtains a local minimizer of the objective function; the second obtains a minimizer or saddle point of the filled function. In the second phase, the vector direction plays an important role in finding a stationary point of the filled function, by assisting in escaping from the neighborhood of the current minimizer of the objective function of the f...
2 Citations
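The two-phase loop described in this abstract can be sketched in miniature. The paper's own filled function is not reproduced here; as a stand-in, this sketch uses a classical Ge-style exponential filled function P(x) = exp(-(x - x1)^2) / (r + f(x)) with r = 1 (chosen so the denominator stays positive on the domain), a simple pattern search as the local minimizer, and an illustrative 1-D test problem. All names and parameters are assumptions, not the paper's method.

```python
import math

# 1-D test problem with two local minima; the global one lies near x ≈ 1.04.
def f(x):
    return (x * x - 1.0) ** 2 - 0.3 * x

def descend(F, x, lo=-2.0, hi=2.0, step=0.1, tol=1e-6):
    """Simple pattern-search descent: a stand-in for any local minimizer."""
    while step > tol:
        for s in (step, -step):
            y = min(max(x + s, lo), hi)
            if F(y) < F(x):
                x = y
                break
        else:
            step *= 0.5            # no improving move: refine the step
    return x

def filled_function_search(x0, delta=0.1):
    x_star = descend(f, x0)                          # phase 1: local minimizer of f
    while True:
        x1 = x_star
        # Ge-style filled function centered at the current minimizer;
        # r = 1 keeps the denominator positive on [-2, 2].
        P = lambda x: math.exp(-(x - x1) ** 2) / (1.0 + f(x))
        for s in (delta, -delta):
            y = descend(P, x1 + s)                   # phase 2: escape the basin
            z = descend(f, y)                        # phase 1 again from there
            if f(z) < f(x_star) - 1e-8:
                x_star = z
                break
        else:
            return x_star                            # no lower minimizer found
```

On this test problem, starting from x0 = -2.0 the search first finds the local minimizer near x ≈ -0.96 and then escapes to the global one near x ≈ 1.04.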
#2 Mustafa Mamat, H-Index: 15
Last: Ismail Mohd, H-Index: 8
(4 authors in total)
The conjugate gradient (CG) method has been used extensively to solve large-scale unconstrained optimization problems because of its favorable iteration count, memory requirements, CPU time, and convergence properties. In this paper we propose a new class of nonlinear conjugate gradient coefficients with global convergence properties, proved under exact line search. The numerical results show that our new CG method is efficient when compared with well-known formulas. Keywords—Conjugate gradient method, conj...
2 Citations
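The generic CG iteration with exact line search that this abstract builds on can be illustrated with the classical Fletcher-Reeves coefficient (not the paper's new coefficient) on a small strictly convex quadratic, where the exact step length has a closed form. The test matrix and all names are illustrative assumptions.

```python
# Minimal nonlinear CG sketch: Fletcher-Reeves coefficient with an exact
# line search on f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def grad(x):
    Ax = matvec(A, x)
    return [Ax[0] - b[0], Ax[1] - b[1]]

def cg_fr(x, tol=1e-10, max_iter=50):
    g = grad(x)
    d = [-g[0], -g[1]]                        # first direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol:
            break
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)       # exact step for a quadratic
        x = [x[0] + alpha * d[0], x[1] + alpha * d[1]]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)  # Fletcher-Reeves coefficient
        d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
        g = g_new
    return x
```

For a symmetric positive definite A, this reaches the exact solution of A x = b (here x = (1/11, 7/11)) in at most n iterations, which is why CG scales well to large problems.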
#2 Mustafa Mamat, H-Index: 15
Last: Osman Omer, H-Index: 2
(5 authors in total)
Conjugate gradient (CG) methods are essential for solving large-scale unconstrained optimization problems. Many studies and modifications have been carried out to improve this method. In this paper, a new class of conjugate gradient coefficients (βk) with a new parameter m = ‖gk‖‖dk−1‖ that possesses global convergence properties is presented. The global convergence and sufficient descent property results are established using inexact line searches to determine the step size of CG, denoted αk. Nu...
2 Citations
#1 Norrlaili Shapiee, H-Index: 3
#2 Mohd Rivaie, H-Index: 7
Last: Ismail Mohd, H-Index: 2
(4 authors in total)
Conjugate gradient (CG) methods are important for large-scale unconstrained optimization due to their low memory requirements and global convergence properties. Numerous studies have been conducted to propose new CG coefficients and to improve efficiency. In this paper, we propose a new CG coefficient based on the original Hestenes-Stiefel CG coefficient. The global convergence result is established using exact line search. Most of our numerical results show that our method is very efficient wh...
6 Citations
#2 Mustafa Mamat, H-Index: 15
Last: Osman Omer, H-Index: 2
(5 authors in total)
Conjugate gradient methods are effective in solving linear equations and non-linear optimization problems. In this work we compare our new conjugate gradient coefficient βk with the classical formula under the strong Wolfe line search; our method satisfies the sufficient descent condition. Numerical results show that the new βk performs better than the classical formula.
11 Citations
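The strong Wolfe line search mentioned here accepts a step size alpha that gives both sufficient decrease (Armijo) and a two-sided curvature bound. A minimal check on a 1-D test function, with the customary illustrative constants c1 = 1e-4 and c2 = 0.1 (assumptions, not values from the paper):

```python
def f(x):
    # simple smooth test function with minimizer at x = 2
    return (x - 2.0) ** 2

def df(x):
    return 2.0 * (x - 2.0)

def strong_wolfe(x, d, alpha, c1=1e-4, c2=0.1):
    """Check sufficient decrease and the strong (two-sided) curvature condition
    for step alpha along direction d from point x."""
    phi0, dphi0 = f(x), df(x) * d                         # value/slope at alpha = 0
    phi_a, dphi_a = f(x + alpha * d), df(x + alpha * d) * d
    armijo = phi_a <= phi0 + c1 * alpha * dphi0           # sufficient decrease
    curvature = abs(dphi_a) <= c2 * abs(dphi0)            # strong curvature bound
    return armijo and curvature
```

Here strong_wolfe(0.0, 1.0, 2.0) accepts the exact minimizing step, while an overshooting step such as alpha = 4.0 is rejected.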
#2 Mustafa Mamat, H-Index: 15
Last: Ismail Mohd, H-Index: 8
(4 authors in total)
The classical steepest descent (SD) method is known as one of the earliest and best methods for minimizing a function. Even though its convergence rate is quite slow, its simplicity has made it one of the easiest methods to use and apply, especially in the form of computer code. In this paper, a new modification of the SD method is proposed using a new search direction (dk) in the form of two parameters. Numerical results show that this new SD has a far superior convergence rate and is more ef...
3 Citations
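For context, the classical SD iteration that the paper modifies can be sketched with an Armijo backtracking line search; the quadratic test function and the constants are illustrative assumptions, and the paper's two-parameter direction dk is not reproduced here.

```python
def f(x, y):
    # simple convex quadratic test function with minimizer at (1, -2)
    return (x - 1.0) ** 2 + 5.0 * (y + 2.0) ** 2

def grad(x, y):
    return 2.0 * (x - 1.0), 10.0 * (y + 2.0)

def steepest_descent(x, y, tol=1e-8, max_iter=1000):
    for _ in range(max_iter):
        gx, gy = grad(x, y)
        if gx * gx + gy * gy < tol:          # stop on a small gradient
            break
        alpha = 1.0
        # Armijo backtracking: halve alpha until sufficient decrease holds
        while f(x - alpha * gx, y - alpha * gy) > \
                f(x, y) - 1e-4 * alpha * (gx * gx + gy * gy):
            alpha *= 0.5
        x, y = x - alpha * gx, y - alpha * gy
    return x, y
```

steepest_descent(0.0, 0.0) converges to the minimizer (1, -2); on ill-conditioned problems the iterates zig-zag, which is the slow convergence the abstract refers to.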
#1 Syazni Shoid, H-Index: 2
#2 Mohd Rivaie, H-Index: 7
Last: Ismail Mohd, H-Index: 8
(4 authors in total)
Conjugate gradient (CG) methods have been widely used as schemes for solving large-scale unconstrained optimization problems. Numerous studies and modifications have been carried out recently to improve this method. In this paper, we propose a new type of CG coefficient (βk) obtained by modifying the Polak and Ribière (PR) method. This new βk is shown to possess global convergence properties under exact line searches. Performance comparisons are made with the four most common βk proposed by early resear...
5 Citations
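The original Polak-Ribière coefficient that this paper modifies is β_PR = g_{k+1}^T (g_{k+1} - g_k) / ‖g_k‖²; a one-line sketch of the classical formula (the paper's modified coefficient is not reproduced here):

```python
def beta_pr(g_new, g_old):
    """Classical Polak-Ribiere CG coefficient:
    g_new^T (g_new - g_old) / ||g_old||^2."""
    num = sum(gn * (gn - go) for gn, go in zip(g_new, g_old))
    den = sum(go * go for go in g_old)
    return num / den
```

For example, beta_pr([0.5, -0.25], [-1.0, -2.0]) returns 0.0625. In practice PR-based methods are often restarted with max(β, 0) to keep the search direction a descent direction.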