Fitting sparse linear models under the sufficient and necessary condition for model identification
Jian Huang, Yuling Jiao, Lican Kang and Yanyan Liu
Statistics & Probability Letters, 2021, vol. 168, issue C
Abstract:
We propose an enhanced support detection and root finding approach (ESDAR) for variable selection in sparse linear models. ESDAR is motivated by the KKT conditions for the ℓ0 penalized regression. In ESDAR, we introduce a step size to balance the primal and dual variables in determining the support of the solution. We establish a sharp oracle error bound and an oracle support recovery property for the solution sequence generated by ESDAR under the weakest possible condition on the design matrix, namely the condition that is sufficient and necessary for the model to be identifiable. The conditions under which we obtain these oracle results are weaker than those required in the literature for the Lasso and for concave selection methods such as SCAD and MCP.
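The abstract describes the iteration only at a high level, but a support-detection-and-root-finding scheme of the kind sketched there can be illustrated as follows. This is a minimal Python sketch based solely on the abstract: the function name esdar_sketch, the sparsity level T, the step size tau, and the stopping rule are assumptions for illustration, not the authors' implementation.

import numpy as np

def esdar_sketch(X, y, T, tau=1.0, max_iter=50):
    # Illustrative support-detection-and-root-finding iteration (a sketch,
    # not the paper's algorithm): T (sparsity level), tau (step size) and
    # the stopping rule are assumed here for demonstration.
    n, p = X.shape
    beta = np.zeros(p)                     # primal variable
    active = np.array([], dtype=int)
    for _ in range(max_iter):
        d = X.T @ (y - X @ beta) / n       # dual variable: scaled residual correlations
        # Support detection: keep the T largest entries of |beta + tau * d|,
        # so the step size tau balances the primal and dual contributions.
        new_active = np.sort(np.argsort(-np.abs(beta + tau * d))[:T])
        if np.array_equal(new_active, active):
            break                          # detected support has stabilized
        active = new_active
        # Root finding: least squares restricted to the detected support.
        coef, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        beta = np.zeros(p)
        beta[active] = coef
    return beta, active

A small synthetic usage example (again illustrative only):

rng = np.random.default_rng(0)
n, p, k = 100, 500, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat, support = esdar_sketch(X, y, T=k, tau=1.0)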
Keywords: Sparse linear model; ℓ0-penalty; Identifiability; Oracle inequality; Support recovery
Date: 2021
Full text (ScienceDirect, subscribers only): http://www.sciencedirect.com/science/article/pii/S0167715220302285
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:168:y:2021:i:c:s0167715220302285
DOI: 10.1016/j.spl.2020.108925