A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization
Zexian Liu,
Yan Ni,
Hongwei Liu and
Wumei Sun
Additional contact information
Zexian Liu: Guizhou University
Yan Ni: Guizhou University
Hongwei Liu: Xidian University
Wumei Sun: Xi’an University of Science and Technology
Journal of Optimization Theory and Applications, 2024, vol. 200, issue 2, No 14, 820-851
Abstract:
Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization and have received increasing attention recently. The search directions of SMCG methods are generated by minimizing an approximate model with the approximate matrix $$B_k$$ over the two-dimensional subspace spanned by the current gradient $$g_k$$ and the latest step. The main drawback of SMCG methods is that the parameter $$g_k^TB_kg_k$$ in the search directions must be determined when calculating the search directions. The parameter $$g_k^TB_kg_k$$ is crucial to SMCG methods and is difficult to determine properly. An alternative solution to this drawback is to find a new way of deriving SMCG methods that is independent of $$g_k^TB_kg_k$$. The projection technique has been used successfully to derive conjugate gradient directions such as the Dai–Kou conjugate gradient direction (Dai and Kou in SIAM J Optim 23(1):296–320, 2013). Motivated by these two observations, we use a projection technique to derive a new SMCG method independent of $$g_k^TB_kg_k$$. More specifically, we project the search direction of the memoryless quasi-Newton method onto the above two-dimensional subspace and derive a new search direction, which is proved to be descent. Remarkably, the proposed method without any line search enjoys the finite termination property for two-dimensional strictly convex quadratic functions. An adaptive scaling factor in the search direction is exploited based on the finite termination property. The proposed method does not need to determine the parameter $$g_k^TB_kg_k$$ and can be regarded as an extension of the Dai–Kou conjugate gradient method. The global convergence of the proposed method is established under suitable assumptions. Numerical comparisons on 147 test functions from the CUTEst library indicate that the proposed method is very promising.
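To make the role of $$g_k^TB_kg_k$$ concrete, the following display sketches the generic SMCG subproblem as described in the abstract; it is a reconstruction from that description, not the paper's exact formulation. With $$s_{k-1} = x_k - x_{k-1}$$, the SMCG search direction is taken as $$d_k = \mu_k g_k + \nu_k s_{k-1}$$, where $$(\mu_k, \nu_k)$$ solve
$$\min_{\mu,\nu}\; g_k^T(\mu g_k + \nu s_{k-1}) + \tfrac{1}{2}(\mu g_k + \nu s_{k-1})^T B_k (\mu g_k + \nu s_{k-1}).$$
The coefficients of this two-variable quadratic are $$g_k^TB_kg_k$$, $$g_k^TB_ks_{k-1}$$ and $$s_{k-1}^TB_ks_{k-1}$$; the latter two can be estimated through the secant relation $$B_k s_{k-1} \approx y_{k-1} = g_k - g_{k-1}$$, whereas $$g_k^TB_kg_k$$ admits no comparable estimate, which is precisely the quantity the proposed projection-based direction avoids.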
Keywords: Conjugate gradient method; Subspace minimization; Memoryless quasi-Newton method; Finite termination property; Global convergence; 90C06; 65K
Date: 2024
Downloads: http://link.springer.com/10.1007/s10957-023-02325-x (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joptap:v:200:y:2024:i:2:d:10.1007_s10957-023-02325-x
Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10957/PS2
DOI: 10.1007/s10957-023-02325-x
Journal of Optimization Theory and Applications is currently edited by Franco Giannessi and David G. Hull