lassopack: Model selection and prediction with regularized regression in Stata
Achim Ahrens, Christian Hansen and Mark Schaffer
Papers from arXiv.org
Abstract:
This article introduces lassopack, a suite of programs for regularized regression in Stata. lassopack implements lasso, square-root lasso, elastic net, ridge regression, adaptive lasso and post-estimation OLS. The methods are suitable for the high-dimensional setting where the number of predictors $p$ may be large and possibly greater than the number of observations, $n$. We offer three different approaches for selecting the penalization (`tuning') parameters: information criteria (implemented in lasso2), $K$-fold cross-validation and $h$-step ahead rolling cross-validation for cross-section, panel and time-series data (cvlasso), and theory-driven (`rigorous') penalization for the lasso and square-root lasso for cross-section and panel data (rlasso). We discuss the theoretical framework and practical considerations for each approach. We also present Monte Carlo results to compare the performance of the penalization approaches.
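As a brief illustration of how the three penalization approaches map onto the three commands, the sketch below shows minimal calls on a hypothetical dataset with outcome y and candidate predictors x1-x100. This is not taken from the paper: variable names and option values (such as nfolds(10) and seed(123)) are illustrative, and the option names follow the lassopack help files but should be verified there.

    * Hypothetical data: outcome y, candidate predictors x1-x100
    * (names are illustrative, not from the paper)

    * 1) Information criteria (lasso2): estimate the lasso path over a
    *    grid of lambda values and select the model by, e.g., extended BIC
    lasso2 y x1-x100, lic(ebic)

    * 2) Cross-validation (cvlasso): K-fold CV over the lambda grid,
    *    then replay with the lambda that minimises the CV criterion
    cvlasso y x1-x100, nfolds(10) seed(123)
    cvlasso, lopt

    * For time-series data (assumed to be tsset), h-step ahead
    * rolling cross-validation over lagged predictors
    cvlasso y L(1/12).y, rolling h(1)

    * 3) Theory-driven (`rigorous') penalization (rlasso), here with
    *    heteroskedasticity-robust penalty loadings
    rlasso y x1-x100, robust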
Date: 2019-01
New Economics Papers: this item is included in nep-big and nep-ets
Citations: 15 (in EconPapers)
Downloads: http://arxiv.org/pdf/1901.05397 (latest version, application/pdf)
Related works:
Journal Article: lassopack: Model selection and prediction with regularized regression in Stata (2020)
Working Paper: lassopack: Model Selection and Prediction with Regularized Regression in Stata (2019)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:1901.05397