How Certain are You of Your Minimum AIC or BIC Values?

I.M.L. Nadeesha Jayaweera () and A. Alexandre Trindade ()
Additional contact information
I.M.L. Nadeesha Jayaweera: Texas Tech University
A. Alexandre Trindade: Texas Tech University

Sankhya A: The Indian Journal of Statistics, 2024, vol. 86, issue 2, No 8, 880-919

Abstract: In choosing a candidate model in likelihood-based inference by minimizing an information criterion, the practitioner is often faced with the difficult task of deciding how far up the ranked list to look. Motivated by this pragmatic necessity, we derive an approximation to the quantiles of a generalized (model selection) information criterion (ZIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood, and which includes common special cases such as AIC and BIC. The method starts from the joint asymptotic normality of the ZIC values, and proceeds by deriving the (asymptotically) exact distribution of the minimum, which can be efficiently (numerically) computed. High quantiles can then be obtained by inverting this distribution function, resulting in what we call a certainty envelope (CE) of plausible models, intended to provide a heuristic upper bound on the location of the actual minimum. The theory is established for three data settings of perennial classical interest: (i) independent and identically distributed, (ii) regression, and (iii) time series. The development in the latter two cases invokes Lindeberg-Feller type conditions for, respectively, normalized sums of conditional distributions and normalized quadratic forms in the observations. The performance of the methodology is examined on simulated data by assessing CE nominal coverage probabilities and comparing them to the bootstrap. Both approaches give coverages close to nominal for large samples, but the bootstrap is on average two orders of magnitude slower. Finally, we hint at the possibility of producing confidence intervals for individual parameters by pivoting the distribution of the minimum ZIC, thus naturally accounting for post-model selection uncertainty.
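The core construction in the abstract (treat the ZIC values of the candidate models as jointly Gaussian, obtain the distribution of their minimum, and invert a high quantile to get a CE cutoff) can be illustrated with a small sketch. The paper computes the distribution of the minimum by numerical means; the snippet below substitutes plain Monte Carlo, and the mean vector and covariance matrix are purely illustrative placeholders, not values from the paper.

```python
import math
import random

random.seed(0)

# Illustrative asymptotic means and covariance of three centred ZIC values
# (placeholders, not taken from the paper).
mu = [0.0, 0.3, 0.8]
cov = [[1.0, 0.6, 0.4],
       [0.6, 1.0, 0.5],
       [0.4, 0.5, 1.0]]

def cholesky(a):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(a)
    fac = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(fac[i][k] * fac[j][k] for k in range(j))
            if i == j:
                fac[i][j] = math.sqrt(a[i][i] - s)
            else:
                fac[i][j] = (a[i][j] - s) / fac[j][j]
    return fac

chol = cholesky(cov)

def sample_min():
    """One draw of min_i Z_i, where Z ~ N(mu, cov) via Z = mu + chol @ z."""
    z = [random.gauss(0.0, 1.0) for _ in mu]
    return min(mu[i] + sum(chol[i][k] * z[k] for k in range(i + 1))
               for i in range(len(mu)))

# Empirical distribution of the minimum ZIC value.
draws = sorted(sample_min() for _ in range(20000))

def quantile(p):
    """Empirical p-quantile of the minimum (0 < p < 1)."""
    return draws[int(p * len(draws))]

# A high quantile of the minimum acts as a CE cutoff: every candidate model
# whose observed ZIC falls below it remains a plausible minimizer.
ce_cutoff = quantile(0.95)
```

In this toy setting, any model whose ZIC value lies below `ce_cutoff` would be retained in the certainty envelope, which is how the CE gives a heuristic upper bound on how far up the ranked list the true minimizer may sit.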

Keywords: Maximum likelihood; model selection; Kullback-Leibler discrepancy; asymptotic normality; post-model selection inference; Primary 62F12; Secondary 62F40
Date: 2024

Downloads: http://link.springer.com/10.1007/s13171-024-00344-y (abstract, text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:spr:sankha:v:86:y:2024:i:2:d:10.1007_s13171-024-00344-y

Ordering information: This journal article can be ordered from
http://www.springer.com/statistics/journal/13171

DOI: 10.1007/s13171-024-00344-y


Sankhya A: The Indian Journal of Statistics is currently edited by Dipak Dey

More articles in Sankhya A: The Indian Journal of Statistics from Springer, Indian Statistical Institute
Bibliographic data for series maintained by Sonal Shukla () and Springer Nature Abstracting and Indexing ().

 
Page updated 2025-03-20
Handle: RePEc:spr:sankha:v:86:y:2024:i:2:d:10.1007_s13171-024-00344-y