A Brief Survey of Modern Optimization for Statisticians
Kenneth Lange,
Eric C. Chi and
Hua Zhou
International Statistical Review, 2014, vol. 82, issue 1, 46-70
Abstract:
Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include non-differentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well-chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience.
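The abstract's point that non-differentiable penalties such as the lasso complicate minimization is commonly handled with proximal methods, where the kink of the l1 penalty is absorbed by a closed-form soft-thresholding step. A minimal sketch of this idea (not the survey's own algorithms; function names and the ISTA scheme here are illustrative choices):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: shrink each entry of z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Proximal gradient descent (ISTA) for 0.5 * ||y - X b||^2 + lam * ||b||_1.

    The smooth least-squares term is handled by a gradient step; the
    non-differentiable lasso penalty is handled by soft-thresholding.
    """
    L = np.linalg.norm(X, 2) ** 2           # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)            # gradient of the smooth part only
        b = soft_threshold(b - grad / L, lam / L)
    return b
```

With an orthogonal design the iteration reduces to soft-thresholding the ordinary least-squares solution, which shows directly why the lasso produces exactly sparse estimates.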
Date: 2014
Citations: 3 (tracked by EconPapers)
Downloads: http://hdl.handle.net/10.1111/insr.12022 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:bla:istatr:v:82:y:2014:i:1:p:46-70
Ordering information: This journal article can be ordered from
http://www.blackwell ... bs.asp?ref=0306-7734
International Statistical Review is currently edited by Eugene Seneta and Kees Zeelenberg
More articles in International Statistical Review from the International Statistical Institute.
Bibliographic data for series maintained by Wiley Content Delivery ().