On cross‐validation for sparse reduced rank regression
Yiyuan She and Hoang Tran
Journal of the Royal Statistical Society Series B, 2019, vol. 81, issue 1, 145-161
Abstract:
In high dimensional data analysis, regularization methods pursuing sparsity and/or low rank have received much attention recently. To provide a proper amount of shrinkage, it is typical to use a grid search and a model comparison criterion to find the optimal regularization parameters. However, we show that fixing the parameters across all folds may result in an inconsistency issue, and it is more appropriate to cross‐validate projection–selection patterns to obtain the best coefficient estimate. Our in‐sample error studies in jointly sparse and rank deficient models lead to a new class of information criteria with four scale‐free forms to bypass the estimation of the noise level. By use of an identity, we propose a novel scale‐free calibration to help cross‐validation to achieve the minimax optimal error rate non‐asymptotically. Experiments support the efficacy of the proposed methods.
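The abstract's central point, that one should cross‐validate the selection pattern induced by a regularization parameter rather than the parameter itself, can be illustrated with a minimal sketch. This is not the authors' algorithm: the function name `cv_selection_pattern` is hypothetical, and a soft‐thresholded least squares fit stands in for a genuine sparse (reduced rank) estimator.

```python
import numpy as np

def cv_selection_pattern(X, y, lambdas, n_folds=5, seed=0):
    """Illustrative sketch only: for each candidate lambda, extract the
    support (selection pattern) it induces on the full data, then
    cross-validate the *pattern* by refitting unpenalized least squares
    on each training fold restricted to that support."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    folds = rng.permutation(n) % n_folds  # roughly balanced fold labels
    best_err, best_support = np.inf, None
    for lam in lambdas:
        # Stand-in sparse estimator: soft-thresholded OLS coefficients.
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        beta = np.sign(beta) * np.maximum(np.abs(beta) - lam, 0.0)
        support = np.flatnonzero(beta)
        if support.size == 0:
            continue
        # Cross-validate the selection pattern, not the penalty level.
        err = 0.0
        for k in range(n_folds):
            tr, te = folds != k, folds == k
            b = np.linalg.lstsq(X[tr][:, support], y[tr], rcond=None)[0]
            err += np.sum((y[te] - X[te][:, support] @ b) ** 2)
        if err < best_err:
            best_err, best_support = err, support
    # Final estimate: unpenalized refit on the full data over the
    # best-scoring support.
    beta_hat = np.zeros(p)
    beta_hat[best_support] = np.linalg.lstsq(
        X[:, best_support], y, rcond=None)[0]
    return beta_hat, best_support
```

The key contrast with conventional practice is that the same fixed pattern is evaluated on every fold, so the cross-validation error scores a candidate model rather than a penalty level that may select different variables on different folds.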
Date: 2019
Citations: View citations in EconPapers (1)
Downloads: https://doi.org/10.1111/rssb.12295 (external link)
Persistent link: https://EconPapers.repec.org/RePEc:bla:jorssb:v:81:y:2019:i:1:p:145-161
Ordering information: This journal article can be ordered from
http://ordering.onli ... 1111/(ISSN)1467-9868
Journal of the Royal Statistical Society Series B is currently edited by P. Fryzlewicz and I. Van Keilegom
More articles in Journal of the Royal Statistical Society Series B from the Royal Statistical Society. Contact information at EDIRC.
Bibliographic data for series maintained by Wiley Content Delivery.