Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Neculai Andrei
European Journal of Operational Research, 2010, vol. 204, issue 3, 410-420
Abstract:
An accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale-Powell restart criterion holds. The parameter scaling the gradient is selected as a spectral gradient. Regarding the steplength computation, it is known that in conjugate gradient algorithms the step lengths may differ from 1 by two orders of magnitude and tend to vary unpredictably; thus, we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in the function values is significantly improved. Under mild conditions the algorithm is globally convergent for strongly convex functions. Computational results for a set consisting of 750 unconstrained optimization test problems show that this new accelerated scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods: SCALCG [3], [4], [5] and [6], CONMIN by Shanno and Phua (1976, 1978) [42] and [43], Hestenes and Stiefel (1952) [25], Polak-Ribière-Polyak (1969) [32] and [33], Dai and Yuan (2001) [17], Dai and Liao (2001) (t=1) [14], conjugate gradient with sufficient descent condition [7], hybrid Dai and Yuan (2001) [17], hybrid Dai and Yuan zero (2001) [17], CG_DESCENT by Hager and Zhang (2005, 2006) [22] and [23], as well as the quasi-Newton L-BFGS method [26] and the truncated Newton method by Nash (1985) [27].
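The core idea sketched in the abstract can be illustrated as follows. This is a minimal, hypothetical Python sketch (not the paper's SCALCG/ACGSC code): the search direction is d = -H g for the standard scaled memoryless BFGS matrix H built from the most recent step s and gradient difference y, with the spectral scaling theta = s^T s / (s^T y); a plain backtracking Armijo rule stands in for the Wolfe line search and acceleration scheme used in the paper, and the restart here is a simple curvature safeguard rather than the Beale-Powell criterion.

```python
import numpy as np

def scaled_memoryless_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a scaled memoryless BFGS-preconditioned direction.

    Direction: d = -H g, where H is the memoryless BFGS matrix formed
    from the last pair (s, y) with spectral scaling theta = s's / s'y.
    Line search: backtracking Armijo (a stand-in for Wolfe conditions).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # backtracking Armijo line search
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy <= 1e-12:
            d = -g_new                       # curvature lost: restart
        else:
            theta = (s @ s) / sy             # spectral gradient scaling
            gs, gy, yy = g_new @ s, g_new @ y, y @ y
            # d = -H g for the scaled memoryless BFGS matrix H:
            # H = theta*I - theta*(s y' + y s')/sy + (1 + theta*yy/sy) s s'/sy
            d = (-theta * g_new
                 + (theta * gy / sy) * s
                 + (theta * gs / sy) * y
                 - ((1.0 + theta * yy / sy) * gs / sy) * s)
        x, g = x_new, g_new
    return x
```

On a well-conditioned convex quadratic this sketch converges in a handful of iterations; the paper's contribution lies in the acceleration of the steplength and the Beale-Powell preconditioner resets, neither of which is reproduced here.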
Keywords: Unconstrained optimization; Conjugate gradient method; Spectral gradient method; Wolfe line search; BFGS preconditioning
Date: 2010
Citations: 9
Downloads:
http://www.sciencedirect.com/science/article/pii/S0377-2217(09)00903-5
Full text for ScienceDirect subscribers only
Persistent link: https://EconPapers.repec.org/RePEc:eee:ejores:v:204:y:2010:i:3:p:410-420
European Journal of Operational Research is currently edited by Roman Slowinski, Jesus Artalejo, Jean-Charles Billaut, Robert Dyson and Lorenzo Peccati
More articles in European Journal of Operational Research from Elsevier
Bibliographic data for series maintained by Catherine Liu.