The Dantzig Discriminant Analysis with High Dimensional Data
Yanli Zhang,
Lei Huo,
Lu Lin and
Yunhui Zeng
Communications in Statistics - Theory and Methods, 2014, vol. 43, issue 23, 5012-5025
Abstract:
It is well known that linear discriminant analysis (LDA) works well and is asymptotically optimal in the fixed-p, large-n setting. However, Bickel and Levina (2004) showed that LDA performs no better than random guessing when p > n. This article studies sparse discriminant analysis via Dantzig penalized least squares. The method avoids estimating the high-dimensional covariance matrix and does not require a sparsity assumption on the inverse covariance matrix. We show theoretically that the new discriminant rule is asymptotically optimal. Simulation and real-data studies show that the classifier outperforms existing sparse methods.
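For readers unfamiliar with the Dantzig-style penalization mentioned in the abstract, the classical Dantzig selector is usually cast as a linear program: minimize the l1 norm of the coefficient vector subject to a sup-norm bound on the correlation between predictors and residuals. The sketch below is only an illustration of that standard LP formulation, not the authors' exact discriminant estimator; the data, dimensions, and tuning value `lam` are all made up for the example.

```python
# Illustrative Dantzig-selector fit via linear programming (NOT the paper's
# exact method): minimize ||beta||_1  s.t.  ||X'(y - X beta)||_inf <= lam.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 50, 100                       # p > n: the regime where plain LDA fails
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]     # sparse signal (assumed for illustration)
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 0.5                            # hypothetical tuning constant
XtX, Xty = X.T @ X, X.T @ y

# Split beta = u - v with u, v >= 0, so the objective sum(u + v) = ||beta||_1.
# The sup-norm constraint |Xty - XtX beta| <= lam becomes two linear systems.
c = np.ones(2 * p)
A_ub = np.block([[XtX, -XtX], [-XtX, XtX]])
b_ub = np.concatenate([lam + Xty, lam - Xty])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")

beta_hat = res.x[:p] - res.x[p:]     # recovered sparse coefficient vector
```

The l1 objective drives most coordinates of `beta_hat` to zero, which is why such estimators remain usable when p exceeds n.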
Date: 2014
Downloads: http://hdl.handle.net/10.1080/03610926.2013.878359 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:43:y:2014:i:23:p:5012-5025
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/lsta20
DOI: 10.1080/03610926.2013.878359
Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe