
Least squares approximation with a diverging number of parameters

Chenlei Leng and Bo Li

Statistics & Probability Letters, 2010, vol. 80, issue 3-4, 254-261

Abstract: Regularized regression with the l1 penalty is a popular approach to variable selection and coefficient estimation. For a unified treatment of l1-constrained model selection, Wang and Leng (2007) proposed the least squares approximation (LSA) method for fixed dimension. LSA exploits a quadratic expansion of the loss function and takes full advantage of the fast Lasso algorithm of Efron et al. (2004). In this paper, we extend fixed-dimension LSA to the setting of a diverging number of parameters. We show that LSA possesses the oracle properties under appropriate conditions when the number of variables grows with the sample size. We also propose a new tuning parameter selection method that achieves the oracle properties. Extensive simulation studies confirm the theoretical results.
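The core idea described in the abstract can be illustrated concretely. The sketch below is not the authors' implementation; it is a minimal illustration of the LSA device of Wang and Leng (2007), assuming a linear model, a plain (unweighted) l1 penalty, and scikit-learn's `Lasso` as the solver. The loss is expanded quadratically around an initial least squares estimator, and a Cholesky factorization turns the resulting quadratic-plus-l1 problem into an ordinary lasso on a p-dimensional pseudo-design; the penalty level `alpha` is an arbitrary illustrative choice, not the paper's tuning rule.

```python
# Illustration (not the authors' code) of the least squares approximation
# (LSA) idea: quadratic expansion of the loss around an initial estimator,
# solved as an ordinary lasso problem.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Step 1: initial estimator -- here ordinary least squares.
beta_tilde, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 2: quadratic expansion of the squared-error loss around beta_tilde:
#   (beta - beta_tilde)' V (beta - beta_tilde) + lambda * sum_j |beta_j|,
# with V = X'X / n.  Factoring V = L L' (Cholesky) rewrites the quadratic
# term as ||L' beta_tilde - L' beta||^2, so the problem becomes a standard
# lasso with p x p pseudo-design L' and pseudo-response L' beta_tilde.
V = X.T @ X / n
L = np.linalg.cholesky(V)
X_star = L.T                 # pseudo-design (p x p)
y_star = L.T @ beta_tilde    # pseudo-response (length p)

# Step 3: solve the reduced lasso problem; alpha is an illustrative value.
lasso = Lasso(alpha=0.1, fit_intercept=False)
lasso.fit(X_star, y_star)
beta_lsa = lasso.coef_
print(np.round(beta_lsa, 2))
```

The computational point is that after the expansion the lasso step works on a p x p system rather than the full n x p data, which is what lets LSA reuse fast lasso algorithms regardless of the original loss function.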

Date: 2010


Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul


Handle: RePEc:eee:stapro:v:80:y:2010:i:3-4:p:254-261