Post-hoc analyses in multiple regression based on prediction error
Rand Wilcox
Journal of Applied Statistics, 2008, vol. 35, issue 1, 9-17
Abstract:
A well-known problem in multiple regression is that it is possible to reject the hypothesis that all slope parameters are equal to zero, yet when the usual Student's T-test is applied to the individual parameters, no significant differences are found. An alternative strategy is to estimate prediction error via the 0.632 bootstrap method for all models of interest and declare the parameters associated with the model that yields the smallest prediction error to differ from zero. The main results in this paper are that this latter strategy can have practical value compared with Student's T-test; that replacing squared error with absolute error can be beneficial in some situations; and that replacing least squares with an extension of the Theil-Sen estimator can substantially increase the probability of identifying the correct model under circumstances that are described.
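The following is a minimal Python/NumPy sketch, not code from the paper, of the selection strategy the abstract describes: estimate prediction error with the 0.632 bootstrap for each candidate subset of predictors and choose the subset with the smallest estimate. The function names (fit_ls, err632, best_subset) and the defaults (n_boot=200, squared-error loss) are illustrative assumptions; only the 0.632 weighting of apparent and out-of-bootstrap error follows the standard estimator.

```python
import itertools
import numpy as np

def fit_ls(X, y):
    """Ordinary least squares fit; returns coefficients (intercept first)."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

def predict(beta, X):
    """Predicted values for the linear model with intercept."""
    Z = np.column_stack([np.ones(len(X)), X])
    return Z @ beta

def err632(X, y, loss=lambda r: r ** 2, n_boot=200, seed=None):
    """0.632 bootstrap estimate of prediction error for one model."""
    rng = np.random.default_rng(seed)
    n = len(y)
    # Apparent (resubstitution) error on the full data.
    app = loss(y - predict(fit_ls(X, y), X)).mean()
    # Out-of-bootstrap error: loss on observations left out of each
    # bootstrap sample, predicted from the fit on that sample.
    oob = [[] for _ in range(n)]
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        out = np.setdiff1d(np.arange(n), idx)
        if out.size == 0:
            continue
        beta = fit_ls(X[idx], y[idx])
        for i, r in zip(out, loss(y[out] - predict(beta, X[out]))):
            oob[i].append(r)
    eps1 = np.mean([np.mean(losses) for losses in oob if losses])
    return 0.368 * app + 0.632 * eps1

def best_subset(X, y, **kw):
    """Return the predictor subset with the smallest estimated error."""
    p = X.shape[1]
    subsets = [s for r in range(1, p + 1)
               for s in itertools.combinations(range(p), r)]
    return min(subsets, key=lambda s: err632(X[:, list(s)], y, **kw))
```

The two variants studied in the paper would slot in as replacements here: passing loss=np.abs gives the absolute-error version, and swapping fit_ls for a Theil-Sen-type regression estimator gives the robust version.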
Keywords: multiple comparisons; prediction error; bootstrap methods; robust regression
Date: 2008
Downloads: http://www.tandfonline.com/doi/abs/10.1080/02664760701683288 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:japsta:v:35:y:2008:i:1:p:9-17
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/CJAS20
DOI: 10.1080/02664760701683288
Journal of Applied Statistics is currently edited by Robert Aykroyd