A Modified Dai–Liao Conjugate Gradient Method Based on a Scalar Matrix Approximation of Hessian and Its Application
Branislav Ivanov,
Gradimir V. Milovanović,
Predrag S. Stanimirović,
Aliyu Muhammed Awwal,
Lev A. Kazakovtsev,
Vladimir N. Krutikov and
Xian-Ming Gu
Journal of Mathematics, 2023, vol. 2023, 1-20
Abstract:
We introduce and investigate proper accelerations of the Dai–Liao (DL) conjugate gradient (CG) family of iterations for solving large-scale unconstrained optimization problems. The improvements are based on appropriate modifications of the CG update parameter in DL conjugate gradient methods. The leading idea is to combine the search directions of accelerated gradient-descent methods, in which the Hessian is approximated by an appropriate diagonal matrix in the quasi-Newton manner, with the search directions of DL-type CG methods. The global convergence of the modified Dai–Liao conjugate gradient method is proved on the set of uniformly convex functions. The efficiency and robustness of the newly presented methods are confirmed by comparison with similar methods, analyzing numerical results for CPU time, the number of function evaluations, and the number of iterative steps. The proposed method is successfully applied to an optimization problem arising in 2D robotic motion control.
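The record gives no formulas, but the classical Dai–Liao search direction that the paper builds on is standard: d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and t > 0. The Python sketch below implements this baseline DL iteration with a simple Armijo backtracking line search; the parameter t, the tolerance, the restart safeguard, and the quadratic test problem are illustrative assumptions, and the authors' scalar-matrix (diagonal Hessian) acceleration is not reproduced here because the abstract does not specify it.

    import numpy as np

    def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
        # Baseline Dai-Liao CG (illustrative sketch; not the paper's modified method).
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                   # initial steepest-descent direction
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            if g.dot(d) >= 0:                    # safeguard: restart if d is not a descent direction
                d = -g
            alpha, rho, c = 1.0, 0.5, 1e-4       # backtracking (Armijo) line search
            while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            dy = d.dot(y)
            # Dai-Liao parameter: beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
            beta = (g_new.dot(y) - t * g_new.dot(s)) / dy if dy != 0 else 0.0
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Illustrative use on a convex quadratic
    A = np.diag([1.0, 10.0, 100.0])
    x_min = dai_liao_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(3))
    print(x_min)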
Date: 2023
Downloads:
http://downloads.hindawi.com/journals/jmath/2023/9945581.pdf (application/pdf)
http://downloads.hindawi.com/journals/jmath/2023/9945581.xml (application/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:jjmath:9945581
DOI: 10.1155/2023/9945581