Bootstrapping some GLM and survival regression variable selection estimators
Rasanji C. Rathnayake and David J. Olive
Communications in Statistics - Theory and Methods, 2023, vol. 52, issue 8, 2625-2645
Abstract:
Inference after variable selection is an important problem. This paper derives the asymptotic distribution of many variable selection estimators, such as forward selection and backward elimination, when the number of predictors is fixed. Under strong regularity conditions the variable selection estimators are asymptotically normal, but in general the asymptotic distribution is a nonnormal mixture distribution. The theory shows that the lasso variable selection and elastic net variable selection estimators are √n consistent estimators of β when lasso and elastic net are consistent estimators of β. A bootstrap technique to eliminate selection bias is to fit the variable selection estimator β̂_VS* to a bootstrap sample to find a submodel, then draw another bootstrap sample and fit that submodel to obtain the bootstrap estimator β̂_MIX*. Bootstrap confidence regions are used for hypothesis testing.
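The bootstrap technique described in the abstract can be sketched in a few lines. The Python code below is a minimal illustration, not the authors' implementation: it assumes a linear regression model with lasso as the variable selection method, uses a second, independent bootstrap sample to refit the selected submodel (the β̂_MIX* replicates), and substitutes simple percentile intervals for the paper's bootstrap confidence regions. All function and variable names are hypothetical.

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)

def mix_bootstrap(X, y, B=500, alpha=0.1):
    """Return a (B, p) array of bootstrap-MIX coefficient replicates.

    For each replicate: (1) draw a bootstrap sample and run lasso to select
    a submodel; (2) draw a second, independent bootstrap sample and refit
    only the selected predictors by least squares, leaving excluded
    coefficients at zero.
    """
    n, p = X.shape
    betas = np.zeros((B, p))
    for b in range(B):
        # Step 1: variable selection on the first bootstrap sample.
        i1 = rng.integers(0, n, n)
        sel = Lasso(alpha=alpha).fit(X[i1], y[i1])
        S = np.flatnonzero(np.abs(sel.coef_) > 1e-8)
        if S.size == 0:
            S = np.arange(p)  # fall back to the full model if nothing is kept
        # Step 2: refit the fixed submodel on a second, independent bootstrap sample.
        i2 = rng.integers(0, n, n)
        fit = LinearRegression().fit(X[i2][:, S], y[i2])
        betas[b, S] = fit.coef_
    return betas

# Toy data: only the first two predictors matter.
n, p = 200, 6
X = rng.standard_normal((n, p))
y = X[:, 0] - 2 * X[:, 1] + rng.standard_normal(n)

reps = mix_bootstrap(X, y)
# Percentile intervals from the replicates (a one-dimensional stand-in for
# the bootstrap confidence regions used in the paper).
ci = np.percentile(reps, [2.5, 97.5], axis=0)
print(ci.T)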
Date: 2023
Downloads: http://hdl.handle.net/10.1080/03610926.2021.1955389 (text/html; access to full text is restricted to subscribers)
Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:52:y:2023:i:8:p:2625-2645
Ordering information: This journal article can be ordered from http://www.tandfonline.com/pricing/journal/lsta20
DOI: 10.1080/03610926.2021.1955389
Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe