A Family of Hybrid Stochastic Conjugate Gradient Algorithms for Local and Global Minimization Problems
Khalid Abdulaziz Alnowibet,
Salem Mahdi,
Ahmad M. Alshamrani,
Karam M. Sallam and
Ali Wagdy Mohamed
Additional contact information
Khalid Abdulaziz Alnowibet: Statistics and Operations Research Department, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
Salem Mahdi: Department of Mathematics & Computer Science, Faculty of Science, Alexandria University, Alexandria 21544, Egypt
Ahmad M. Alshamrani: Statistics and Operations Research Department, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
Karam M. Sallam: School of IT and Systems, University of Canberra, Canberra, ACT 2601, Australia
Ali Wagdy Mohamed: Operations Research Department, Faculty of Graduate Studies for Statistical Research, Cairo University, Giza 12613, Egypt
Mathematics, 2022, vol. 10, issue 19, 1-37
Abstract:
This paper has two main parts, Part I and Part II, which address local and global minimization problems, respectively. In Part I, a new conjugate gradient (CG) method is proposed and combined with a line-search technique to obtain a globally convergent algorithm. Finite-difference approximations are used to compute approximate values of the first derivative of the function f. A convergence analysis of the proposed method is established. Comparisons between the new CG method and four other CG methods show that the proposed method is promising and competitive for finding a local optimum. In Part II, three formulas are designed to generate a set of candidate solutions. This set of stochastic formulas is hybridized with the globally convergent CG algorithm to obtain a hybrid stochastic conjugate gradient algorithm, denoted HSSZH, which finds an approximate global solution of a global optimization problem. Five combined stochastic conjugate gradient algorithms are constructed, and performance profiles are used to assess and compare this family of hybrid stochastic conjugate gradient algorithms. The comparison between the proposed HSSZH algorithm and four other hybrid stochastic conjugate gradient techniques shows that HSSZH outperforms the four algorithms in efficiency, reliability and effectiveness at finding an approximate solution of a global optimization problem with a non-convex objective function.
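The record does not give the paper's specific CG update formula or line-search rule. As an illustrative sketch of the ingredients named in Part I (a CG search direction, a globalizing line search, and finite-difference gradient approximations), the following uses the classical Fletcher–Reeves coefficient and a backtracking Armijo line search as stand-ins; the test function and all parameter values are assumptions for demonstration only.

```python
import numpy as np

def fd_grad(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def cg_minimize(f, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with a backtracking (Armijo) line search.
    The Fletcher-Reeves beta below is a placeholder for the paper's new formula."""
    x = np.asarray(x0, dtype=float)
    g = fd_grad(f, x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: shrink alpha until the Armijo condition holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d
        g_new = fd_grad(f, x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# Example: convex quadratic with minimizer (1, -2)
quad = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2
x_star = cg_minimize(quad, np.array([0.0, 0.0]))
```

The descent-direction restart is a standard safeguard when the line search is inexact; the paper's own method presumably enforces global convergence through its specific CG formula and line-search conditions instead.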
Keywords: global optimization; unconstrained minimization; numerical approximations of gradients; meta-heuristics; stochastic parameters; conjugate gradient methods; efficient algorithm; performance profiles; comparisons; testing
JEL-codes: C
Date: 2022
Citations: 1 (in EconPapers)
Downloads: (external link)
https://www.mdpi.com/2227-7390/10/19/3595/pdf (application/pdf)
https://www.mdpi.com/2227-7390/10/19/3595/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:10:y:2022:i:19:p:3595-:d:931416
Mathematics is currently edited by Ms. Emma He