Limited memory BFGS method based on a high-order tensor model
Fahimeh Biglari and
Ali Ebadian
Computational Optimization and Applications, 2015, vol. 60, issue 2, 413-422
Abstract:
This paper aims to employ a modified quasi-Newton equation within the framework of the limited memory BFGS method to solve large-scale unconstrained optimization problems. The modified secant equation is derived by means of a fourth-order tensor model to improve the curvature information of the objective function. The global and local convergence properties of the modified LBFGS method on uniformly convex problems are also studied. The numerical results indicate that the proposed limited memory method is superior to the standard LBFGS method. Copyright Springer Science+Business Media New York 2015
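The modified secant equation itself is not reproduced on this page, but the following minimal NumPy sketch illustrates the general idea described in the abstract: a standard L-BFGS two-loop recursion whose stored curvature pairs (s_k, y_k) use a corrected vector in place of the plain gradient difference. The function modified_y and its cubic-model correction theta are hypothetical placeholders drawn from the broader modified-secant literature, not the paper's fourth-order tensor derivation.

```python
# Minimal sketch (not the authors' code): L-BFGS two-loop recursion with a hook
# for a modified curvature pair. modified_y is a hypothetical placeholder using a
# common cubic-model correction; the paper's fourth-order correction differs.
import numpy as np

def modified_y(s, y, f_old, f_new, g_old, g_new):
    # One common correction from the modified-secant literature (placeholder):
    #   y~ = y + (theta / s^T s) * s,  theta = 2(f_k - f_{k+1}) + (g_k + g_{k+1})^T s_k
    theta = 2.0 * (f_old - f_new) + (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s

def lbfgs_direction(g, s_list, y_list):
    # Standard two-loop recursion: returns d = -H_k g from the m most recent
    # (s_i, y_i) pairs, stored oldest to newest.
    if not s_list:                      # no curvature pairs yet: steepest descent
        return -g
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Standard initial scaling H_k^0 = (s^T y / y^T y) * I with the newest pair.
    s, y = s_list[-1], y_list[-1]
    q *= (s @ y) / (y @ y)
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

In a full method of this kind, each iteration would store (s_k, modified_y(...)) instead of (s_k, y_k), typically with a safeguard such as requiring s_k dot y~_k > 0 so the implicit Hessian approximation stays positive definite.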
Keywords: Large scale nonlinear optimization; Modified limited memory quasi-Newton method; Curvature approximation; Modified quasi-Newton equation; R-linear convergence; 90C06; 90C53
Date: 2015
Downloads: http://hdl.handle.net/10.1007/s10589-014-9678-4 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:spr:coopap:v:60:y:2015:i:2:p:413-422
Ordering information: This journal article can be ordered from
http://www.springer.com/math/journal/10589
DOI: 10.1007/s10589-014-9678-4
Computational Optimization and Applications is currently edited by William W. Hager