Sparsity and smoothness via the fused lasso
Robert Tibshirani, Michael Saunders, Saharon Rosset, Ji Zhu and Keith Knight
Journal of the Royal Statistical Society Series B, 2005, vol. 67, issue 1, 91-108
Abstract:
The lasso penalizes a least squares regression by the sum of the absolute values (L1-norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients equal to 0). We propose the 'fused lasso', a generalization designed for problems with features that can be ordered in some meaningful way. The fused lasso penalizes the L1-norm of both the coefficients and their successive differences. Thus it encourages sparsity of the coefficients and also sparsity of their differences, i.e. local constancy of the coefficient profile. The fused lasso is especially useful when the number of features p is much greater than the sample size N. The technique is also extended to the 'hinge' loss function that underlies the support vector classifier. We illustrate the methods on examples from protein mass spectroscopy and gene expression data.
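To make the penalty concrete, the following is a minimal illustrative sketch of a Lagrangian form of the fused lasso criterion using the cvxpy modelling library. The data, the penalty weights lam1 and lam2, and the solver are placeholders, and this is not the computational approach described in the paper; it only shows the two L1 terms (on the coefficients and on their successive differences) that the abstract refers to.

import numpy as np
import cvxpy as cp

# Hypothetical data: p ordered features (e.g. positions along a spectrum), N samples.
N, p = 50, 200
rng = np.random.default_rng(0)
X = rng.standard_normal((N, p))
y = rng.standard_normal(N)

lam1, lam2 = 1.0, 1.0  # hypothetical tuning parameters for the two penalties

beta = cp.Variable(p)
objective = cp.Minimize(
    0.5 * cp.sum_squares(y - X @ beta)   # least squares loss
    + lam1 * cp.norm1(beta)              # sparsity of the coefficients
    + lam2 * cp.norm1(cp.diff(beta))     # sparsity of successive differences (local constancy)
)
cp.Problem(objective).solve()
beta_hat = beta.value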
Date: 2005
DOI: https://doi.org/10.1111/j.1467-9868.2005.00490.x
Persistent link: https://EconPapers.repec.org/RePEc:bla:jorssb:v:67:y:2005:i:1:p:91-108